124 changes: 110 additions & 14 deletions content/en/observability_pipelines/destinations/microsoft_sentinel.md
1. [Add Microsoft Sentinel][6] to the workspace.
1. [Create a Data Collection Endpoint (DCE)][7].
1. [Create a Log Analytics Workspace][8] in the workspace if you haven't already.
1. Follow the instructions for the type of table to which you want to send data.
{{< tabs >}}
{{% tab "Azure Table" %}}
1. Create a JSON file for your Data Collection Rule (DCR) parameters. See [Data collection rule (DCR)][1] for more information.
- In the `streamDeclarations` property, you must list all log fields you want mapped to the corresponding Azure table column. See [Stream declarations][2] for more information.
- In the `transformKql` property, you must drop all log fields that are not mapped to table columns (for example, with `project-away`). See [Data flow properties][3] for more information.
- **Note**: Each log field must be listed in one of these properties: either `streamDeclarations` or `transformKql`; otherwise the log is dropped. See [Monitor DCR data collection in Azure Monitor][4] on how to set up an alert when logs are dropped.
- For example, this JSON file (`dcr-commonsecuritylog.json`) adds the log fields to be mapped to the [`CommonSecurityLog`][5] table:
```json
{
    "location": "eastus",
    "kind": "Direct",
    "properties": {
        "dataCollectionEndpointId": "<DCE_RESOURCE_ID>",
        "streamDeclarations": {
            "Custom-CommonSecurityLog": {
                "columns": [
                    { "name": "TimeGenerated", "type": "datetime" },
                    { "name": "DeviceVendor", "type": "string" },
                    { "name": "DeviceProduct", "type": "string" },
                    { "name": "DeviceVersion", "type": "string" },
                    { "name": "DeviceEventClassID", "type": "string" },
                    { "name": "Activity", "type": "string" },
                    { "name": "LogSeverity", "type": "string" },
                    { "name": "SourceIP", "type": "string" },
                    { "name": "DestinationIP", "type": "string" },
                    { "name": "Message", "type": "string" },
                    { "name": "source_type", "type": "string" },
                    { "name": "path", "type": "string" },
                    { "name": "timestamp", "type": "string" }
                ]
            }
        },
        "destinations": {
            "logAnalytics": [
                {
                    "workspaceResourceId": "<WORKSPACE_RESOURCE_ID>",
                    "name": "LogAnalyticsDest"
                }
            ]
        },
        "dataFlows": [
            {
                "streams": ["Custom-CommonSecurityLog"],
                "destinations": ["LogAnalyticsDest"],
                "transformKql": "source | project-away source_type, path, timestamp",
                "outputStream": "Microsoft-CommonSecurityLog"
            }
        ]
    }
}
```
- Replace the placeholders:
- `<DCE_RESOURCE_ID>` with the ID of the DCE resource you created in step 2. Run the [`az monitor data-collection endpoint show`][9] command to get the DCE resource ID. For example:
```bash
az monitor data-collection endpoint show \
--name "<DCE_NAME>" \
--resource-group <RESOURCE_GROUP> \
--subscription <SUBSCRIPTION_ID> \
--query "id"
```
- `<WORKSPACE_RESOURCE_ID>` with the ID of the Log Analytics Workspace you created in step 3. Run the [`az monitor log-analytics workspace show`][10] command to get the workspace resource ID. For example:
```bash
az monitor log-analytics workspace show \
--workspace-name "<WORKSPACE_NAME>" \
--resource-group <RESOURCE_GROUP> \
--subscription <SUBSCRIPTION_ID> \
--query "id"
```

- See [CommonSecurityLog Columns][6] for a full list of `commonsecuritylog` table columns.
- See [Supported Azure Tables][7] for a list of all available tables to which you can send data.
1. Run the [`az monitor data-collection rule create`][8] Azure CLI command to create a DCR with the JSON file you created in the previous step. For example, with the `dcr-commonsecuritylog.json` example file:
```bash
az monitor data-collection rule create \
--resource-group "myResourceGroup" \
--location "eastus" \
--name "myCollectionRule" \
--subscription "mysubscription" \
--rule-file "/path/to/json/dcr-commonsecuritylog.json"
```
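Before running the command above, you can sanity-check the coverage rule described in the note: every field on the log must appear either as a `streamDeclarations` column or in the `transformKql` drop list, or the log is dropped. The following Python sketch is illustrative only and not part of the official setup; the `uncovered_fields` helper, the trimmed DCR fragment, and the sample log are assumptions for demonstration, and the parser only understands a simple `project-away` clause.

```python
import re

def uncovered_fields(dcr: dict, sample_log: dict) -> set:
    """Return sample-log fields that are neither declared as stream
    columns nor dropped by a 'project-away' clause in transformKql."""
    props = dcr["properties"]
    declared = {
        col["name"]
        for stream in props["streamDeclarations"].values()
        for col in stream["columns"]
    }
    dropped = set()
    for flow in props["dataFlows"]:
        # Pull field names out of "source | project-away a, b, c".
        match = re.search(r"project-away\s+(.+)", flow.get("transformKql", ""))
        if match:
            dropped |= {f.strip() for f in match.group(1).split(",")}
    return set(sample_log) - declared - dropped

# Trimmed DCR fragment mirroring the example above (assumed shape).
dcr = {
    "properties": {
        "streamDeclarations": {
            "Custom-CommonSecurityLog": {
                "columns": [
                    {"name": "TimeGenerated", "type": "datetime"},
                    {"name": "Message", "type": "string"},
                ]
            }
        },
        "dataFlows": [
            {
                "streams": ["Custom-CommonSecurityLog"],
                "transformKql": "source | project-away source_type, path, timestamp",
            }
        ],
    }
}

sample = {
    "TimeGenerated": "2024-07-22T11:47:51Z",
    "Message": "ok",
    "path": "/var/log",
    "hostname": "web-1",  # hypothetical extra field
}
print(uncovered_fields(dcr, sample))  # prints {'hostname'}: neither declared nor dropped
```

Any field the helper reports would be silently lost at ingestion, so either add it to `streamDeclarations` or append it to the `project-away` list.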

[1]: https://learn.microsoft.com/en-us/azure/azure-monitor/logs/logs-ingestion-api-overview#data-collection-rule-dcr
[2]: https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/data-collection-rule-structure#stream-declarations
[3]: https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/data-collection-rule-structure#data-flow-properties
[4]: https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/data-collection-monitor
[5]: https://learn.microsoft.com/en-us/azure/azure-monitor/reference/tables/commonsecuritylog
[6]: https://learn.microsoft.com/en-us/azure/azure-monitor/reference/tables/commonsecuritylog#columns
[7]: https://learn.microsoft.com/en-us/azure/azure-monitor/logs/logs-ingestion-api-overview#supported-tables
[8]: https://learn.microsoft.com/en-us/cli/azure/monitor/data-collection/rule?view=azure-cli-latest#az-monitor-data-collection-rule-create
[9]: https://learn.microsoft.com/en-us/cli/azure/monitor/data-collection/endpoint?view=azure-cli-latest#az-monitor-data-collection-endpoint-show
[10]: https://learn.microsoft.com/en-us/cli/azure/monitor/log-analytics/workspace?view=azure-cli-latest#az-monitor-log-analytics-workspace-show

{{% /tab %}}
{{% tab "Custom table" %}}
1. In the Log Analytics Workspace, navigate to **Settings** > **Tables**.
1. Click **+ Create**.
1. Define a custom table (for example, `MyOPWLogs`).
- **Notes**:<br>- After the table is configured, the prefix `Custom-` and suffix `_CL` are automatically appended to the table name. For example, if you defined the table name in Azure to be `MyOPWLogs`, the full table name is stored as `Custom-MyOPWLogs_CL`. You must use the full table name when you set up the Observability Pipelines Microsoft Sentinel destination.<br>- The full table name can be found in the resource JSON of the DCR under `streamDeclarations`.
1. Select **New Custom Log (DCR-based)**.
1. Click **Create a new data collection rule** and select the DCE you created earlier.

1. Click **Next**.
1. Upload a sample JSON log. For this example, the following JSON is used for the **Schema and Transformation**, where `TimeGenerated` is required:
```json
{
"TimeGenerated": "2024-07-22T11:47:51Z",
"event": {}
}
```
1. Click **Create**.
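If you prefer to generate the sample log rather than write it by hand, a minimal Python sketch such as the following produces an equivalent record; the `sample_log` helper is a hypothetical convenience, not part of the official setup, and the empty `event` payload is a placeholder exactly as in the sample above.

```python
import json
from datetime import datetime, timezone

def sample_log() -> str:
    """Build a minimal sample record for the DCR schema wizard.
    TimeGenerated must be an ISO 8601 UTC timestamp; the wizard
    infers column types from this sample."""
    record = {
        "TimeGenerated": datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ"),
        "event": {},  # placeholder payload, as in the sample above
    }
    return json.dumps(record, indent=2)

print(sample_log())
```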
{{% /tab %}}
{{< /tabs >}}
1. In Azure, navigate to **Microsoft Entra ID**.
1. Click **Add** > **App Registration**.
1. Click **Create**.