Collecting System & Application events using Azure Monitor Agent

This blog will give you insight into how to set up collection of System & Application events (eventlog) from Windows devices using Azure Monitor Agent (AMA).

This blog-post is part of a series of blog posts to master Azure logging in depth (overview).

To get you started, you can find ARM-templates & scripts in my AzureLogLibrary (github). Details will be covered in the articles.


Quick Links

How to start collecting System & Application events using ARM-template?
How to set up in the GUI
High-level architecture / flow
Data transformation example
Tutorial – How to make data transformations using Data Collection Rules?
Data transformation architecture
High-level steps to do data transformation
Step 1: Kusto command must be defined
Escape characters in advanced queries
Step 2: Deploy a new DCR
Step 3a: Data transformation using AzLogDcrIngestPS
Step 3b: Adding the TransformKql using REST API and Powershell (alternative method)
Step 4: Verification of changes in DCR (optional)
Step 5: Associate the DCR rule to the machine(s)
Real-life examples of effect of transformations
Example 1 – removing specific SecurityEvent (5145) from a particular server
Example 2 – removing syslog traffic

How to start collecting System & Application events using ARM-template?

To get you started, check out the ARM-template on my GitHub.

I also provide more documentation and a PowerShell script to deploy additional DCRs.

Lastly, I am also providing ‘Deploy to Azure’ shortcuts based on the mentioned ARM-templates, so you can deploy the samples into your environment.
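
If you prefer deploying the ARM-template from PowerShell instead of the ‘Deploy to Azure’ buttons, a minimal sketch is shown below – the resource group name and template file name are examples, so replace them with the resource group you want to deploy into and the template you downloaded from the repository.

####################################################
# DEPLOY DCR FROM ARM-TEMPLATE (sketch - names below are examples)
####################################################

Connect-AzAccount

New-AzResourceGroupDeployment -ResourceGroupName "rg-logworkspaces" `
                              -TemplateFile ".\dcr-collect-system-application-events.json" `
                              -Verbose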


How to set up in the GUI?

Azure Data Collection Rules (DCRs) are part of Azure Monitor.

You don’t need to associate the DCR with any resources when you are just setting it up. You can do this later.

Some data sources require a data collection endpoint (text logs, IIS logs, Log Ingestion API). This is covered in another blog post.

You can just click Next.

If you need to associate devices with the DCR after you have configured it, just find the DCR and associate the devices as shown below.

High-level architecture / flow

All data will be sent into the standard Azure Log Analytics table, Event.

If you want to collect all system & application events, the DCR contains the following xPathQueries in its dataSources section:

"xPathQueries": [
    "Application!*[System[(Level=1 or Level=2 or Level=3 or Level=4 or Level=0 or Level=5)]]",
    "System!*[System[(Level=1 or Level=2 or Level=3 or Level=4 or Level=0 or Level=5)]]"
]

The complete DCR looks like this:

{
    "properties": {
        "description": "Data collection rule for VM Insights.",
        "immutableId": "dcr-b8358859f9a9400b933afcefd8aa1c1b",
        "dataSources": {
            "windowsEventLogs": [
                {
                    "streams": [
                        "Microsoft-Event"
                    ],
                    "xPathQueries": [
                        "Application!*[System[(Level=1 or Level=2 or Level=3 or Level=4 or Level=0 or Level=5)]]",
                        "System!*[System[(Level=1 or Level=2 or Level=3 or Level=4 or Level=0 or Level=5)]]"
                    ],
                    "name": "eventLogsDataSource"
                }
            ]
        },
        "destinations": {
            "logAnalytics": [
                {
                    "workspaceResourceId": "/subscriptions/fce4f282-fcc6-43fb-94d8-bf1701b862c3/resourcegroups/rg-logworkspaces/providers/microsoft.operationalinsights/workspaces/log-platform-management-srvnetworkcloud-p",
                    "workspaceId": "b7d80924-d55d-4bf6-b2b3-9889301c7114",
                    "name": "DataCollectionEventSysApp"
                }
            ]
        },
        "dataFlows": [
            {
                "streams": [
                    "Microsoft-Event"
                ],
                "destinations": [
                    "DataCollectionEventSysApp"
                ]
            }
        ],
        "provisioningState": "Succeeded"
    }
}
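
If you only want to collect critical, error and warning events, you can narrow the xPathQueries by dropping the informational levels (Level 1 = Critical, 2 = Error, 3 = Warning, 4 and 0 = Information, 5 = Verbose). A sketch of that variation:

"xPathQueries": [
    "Application!*[System[(Level=1 or Level=2 or Level=3)]]",
    "System!*[System[(Level=1 or Level=2 or Level=3)]]"
]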

Data transformation example

If you want to exclude specific events, you can do this with a transformation written in Kusto syntax:

 "transformKql": "source | where (EventID != 8002) and (EventID != 5058) and (EventID != 4662) and (EventID != 4688)",

Here is a complete DCR that collects all events but excludes specific events through a transformation:

{
    "properties": {
        "immutableId": "dcr-41624e73a1ec47c5a07edc23b41e938b",
        "dataSources": {
            "windowsEventLogs": [
                {
                    "streams": [
                        "Microsoft-Event"
                    ],
                    "xPathQueries": [
                        "Application!*[System[(Level=1 or Level=2 or Level=3 or Level=4 or Level=0 or Level=5)]]",
                        "System!*[System[(Level=1 or Level=2 or Level=3 or Level=4 or Level=0 or Level=5)]]"
                    ],
                    "name": "eventLogsDataSource"
                }
            ]
        },
        "destinations": {
            "logAnalytics": [
                {
                    "workspaceResourceId": "/subscriptions/fce4f282-fcc6-43fb-94d8-bf1701b862c3/resourceGroups/rg-logworkspaces/providers/Microsoft.OperationalInsights/workspaces/log-platform-management-srvnetworkcloud-p",
                    "workspaceId": "b7d80924-d55d-4bf6-b2b3-9889301c7114",
                    "name": "DataCollectionEvent"
                }
            ]
        },
        "dataFlows": [
            {
                "streams": [
                    "Microsoft-Event"
                ],
                "destinations": [
                    "DataCollectionEvent"
                ],
                "transformKql": "source | where (EventID != 8002) and (EventID != 5058) and (EventID != 4662) and (EventID != 4688)",
                "outputStream": "Microsoft-Event"
            }
        ],
        "provisioningState": "Succeeded"
    },
    "location": "westeurope",
    "kind": "Windows",
    "id": "/subscriptions/fce4f282-fcc6-43fb-94d8-bf1701b862c3/resourceGroups/rg-logworkspaces/providers/Microsoft.Insights/dataCollectionRules/dcr-ingest-exclude-security-eventid",
    "name": "dcr-ingest-exclude-security-eventid",
    "type": "Microsoft.Insights/dataCollectionRules",
    "etag": "\"ba004029-0000-0d00-0000-640e2bff0000\"",
    "systemData": {
        "createdBy": "mok@2linkit.net",
        "createdByType": "User",
        "createdAt": "2022-12-23T07:56:19.0759102Z",
        "lastModifiedBy": "mok@2linkit.net",
        "lastModifiedByType": "User",
        "lastModifiedAt": "2023-03-12T19:46:06.6129653Z"
    }
}

Tutorial – How to make data transformations using Data Collection Rules?

The text below is embedded from another blog post.

This section will show you the steps for setting up data transformations – and how you can do the transformation using AzLogDcrIngestPS (my PowerShell module) or using REST API commands in PowerShell.

If you want to read about transformations in depth, you can find more details here.

You can also get inspired by real-life samples (and their effect) of using transformations to reduce costs.

Quick Links

Data transformation architecture
High-level steps to do data transformation
Step 1: Kusto command must be defined
Escape characters in advanced queries
Step 2: Deploy a new DCR
Step 3a: Data transformation using AzLogDcrIngestPS
Step 3b: Adding the TransformKql using REST API and Powershell (alternative method)
Step 4: Verification of changes in DCR (optional)
Step 5: Associate the DCR rule to the machine(s)
Real-life examples of effect of transformations
Example 1 – removing specific SecurityEvent (5145) from a particular server
Example 2 – removing syslog traffic

Data transformation architecture

Data transformations run in the Azure data ingestion pipeline and happen very quickly as the data is being uploaded.

Data transformations are defined in the transformKql property in the dataFlows section of the DCR.

"dataFlows": [
            {
                "streams": [
                    "Microsoft-SecurityEvent"
                ],
                "destinations": [
                    "DataCollectionEvent"
                ],
                "transformKql": "source | where (EventID != 8002) and (EventID != 5058) and (EventID != 4662)",
                "outputStream": "Microsoft-SecurityEvent"
            }
        ],

High-level steps to do data transformation

The five steps to add a data transformation are:

  1. Kusto command must be defined to understand the syntax to exclude the data
  2. Deploy a new DCR
  3. Add the transformKql to the DCR – I am covering two methods for this in the article (AzLogDcrIngestPS and REST API)
  4. Verification of changes in DCR (optional)
  5. Assign the DCR to the machine(s) where the transformation must happen

In the example below, I am doing a transformation to remove data. Another use case is to add new columns based on incoming data.

I hope the example below gives you the insight needed to understand how to work with transformations.

Step 1: Kusto command must be defined

Transformations define which data should be sent through the pipeline.

If you want to exclude specific events, you instruct the transformation to exclude these events using standard Kusto – except that you refer to the table name as source:

source | where (EventID != 4662)

Below are four samples:

SecurityEvent | where EventID != 5145
Shows all Security events except for EventID 5145.

SecurityEvent | where (EventID != 8002) and (EventID != 5058) and (EventID != 4662)
Shows all Security events except for EventID 8002, 5058 and 4662.

Event | where not(EventLog == "Application" and EventID == 10016)
Shows all System and Application events, except for Application events with EventID 10016.

CommonSecurityLog | where (DeviceVendor !contains "sonicwall") or ((DeviceVendor contains "sonicwall") and (Activity contains "connection opened" or Activity contains "connection closed") and (Protocol != "udp/dns"))
Shows all CEF/syslog records where DeviceVendor is not SonicWall (for example Cisco), plus the SonicWall "connection opened"/"connection closed" events where the protocol is not udp/dns.

Since the transformation is applied to each record individually, it can’t use any KQL operators that act on multiple records. Only operators that take a single row as input and return no more than one row are supported.

For example, summarize isn’t supported since it summarizes multiple records. See Supported KQL features for a complete list of supported features.

Therefore, you cannot use cross-workspace references, such as doing a lookup in another table.

When the Kusto command is working as expected, change the table name to source.

SecurityEvent | where EventID != 5145

was changed to

source | where EventID != 5145
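
Before you convert the table name to source, you may want to validate the Kusto query directly against your Log Analytics workspace from PowerShell. A minimal sketch using the Az.OperationalInsights module (the workspace id below is the one from this example – replace it with your own):

# Sketch - validate the Kusto query against the workspace before using it as a transformation
$Query = 'SecurityEvent | where EventID != 5145 | take 10'

$Result = Invoke-AzOperationalInsightsQuery -WorkspaceId "b7d80924-d55d-4bf6-b2b3-9889301c7114" -Query $Query
$Result.Results | Format-Table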

Escape characters in advanced queries

If you need to work with advanced queries, you may need to adjust the query with escape characters:

SecurityEvent | where (Account != "WINSFTP\\autotest") and (EventID != 4688 and EventID != 8002 and EventID != 4625) and (Account != "WORKGROUP\\WINSFTP$")

will be changed to

source\r\n| where (Account != \"WINSFTP\\\\autotest\") and (EventID != 4688 and EventID != 8002 and EventID != 4625) and (Account != \"WORKGROUP\\\\WINSFTP$\")

Using an online converter

When working with advanced queries, I prefer to take my Kusto query and paste it into an online converter, which adds the escape characters for me.

Personally, I use the online website https://jsonformatter.org/json-escape
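
As an alternative to the online converter, PowerShell can do the same escaping locally. A small sketch – ConvertTo-Json returns the string quoted and escaped, ready to be used as the transformKql value:

# Sketch - escape a Kusto query for use in the DCR JSON
$Kql = 'SecurityEvent | where (Account != "WINSFTP\\autotest") and (EventID != 4688 and EventID != 8002 and EventID != 4625)'

$Kql | ConvertTo-Json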

Step 2: Deploy a new DCR

Configure a standard DCR and choose what you want to collect – for example, all security events or all system and application events.

Make a note of the ResourceId of the DCR. You will use the ResourceId in step 3.
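
If you do not want to copy the ResourceId from the portal, you can also look it up with PowerShell. A small sketch – the DCR name below is the one used in this example, so replace it with your own:

# Sketch - look up the ResourceId of the DCR
$Dcr = Get-AzResource -ResourceType "Microsoft.Insights/dataCollectionRules" -Name "dcr-ingest-exclude-security-eventid"
$ResourceId = $Dcr.ResourceId
$ResourceId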


Step 3a: Data transformation using AzLogDcrIngestPS

It is very easy to work with transformations using my PowerShell module, AzLogDcrIngestPS.

You can read about my PowerShell module, AzLogDcrIngestPS, here.

To get started viewing an existing transformation, you only need the ResourceId, which was noted in step 2.

You can view an existing data transformation (transformKql) using the function Get-AzDataCollectionRuleTransformKql:

Get-AzDataCollectionRuleTransformKql  -DcrResourceID <DCR resourceid>

Here you can see the value of transformKql (the line starting with source):

PS > Get-AzDataCollectionRuleTransformKql -DcrResourceId /subscriptions/fxxxxx4d8-bf1701b862c3/resourceGroups/rg-logworkspaces/providers/microsoft.insights/dataCollectionRules/dcr-ingest-exclude-security-eventid
source | where (EventID != 8002) and (EventID != 5058) and (EventID != 4662) and (EventID != 4688)

You can add a transformation to an existing DCR using the function Update-AzDataCollectionRuleTransformKql

Update-AzDataCollectionRuleTransformKql -DcrResourceId <DCR-ID> -transformKql  <transform-KQL>

Here you can see the value before and after updating the transformation.

Value before

PS > Get-AzDataCollectionRuleTransformKql -DcrResourceId /subscriptions/fce4f282-fcc6-43fb-94d8-bf1701b862c3/resourceGroups/rg-logworkspaces/providers/microsoft.insights/dataCollectionRules/dcr-ingest-exclude-security-eventid
source | where (EventID != 8002) and (EventID != 5058) and (EventID != 4662) and (EventID != 4688)

Adding the transformation

PS > Update-AzDataCollectionRuleTransformKql -DcrResourceId /subscriptions/xxxxxx-43fb-94d8-bf1701b862c3/resourceGroups/rg-logworkspaces/providers/microsoft.insights/dataCollectionRules/dcr-ingest-exclude-security-eventid -transformKql  "source | where (EventID != 8002) and (EventID != 5058) and (EventID != 4662) and (EventID != 4688) and (EventID != 4663)"

Updating transformKql for DCR
/subscriptions/xxxxxfb-94d8-bf1701b862c3/resourceGroups/rg-logworkspaces/providers/microsoft.insights/dataCollectionRules/dcr-ingest-exclude-security-eventid

Value after – note that "and (EventID != 4663)" has been added compared to the value before.

PS > Get-AzDataCollectionRuleTransformKql -DcrResourceId /subscriptions/xxxxx3fb-94d8-bf1701b862c3/resourceGroups/rg-logworkspaces/providers/microsoft.insights/dataCollectionRules/dcr-ingest-exclude-security-eventid
source | where (EventID != 8002) and (EventID != 5058) and (EventID != 4662) and (EventID != 4688) and (EventID != 4663)

You can install AzLogDcrIngestPS from the PowerShell Gallery:

Install-Module AzLogDcrIngestPS

You need to connect to Azure first, using an account that has the required RBAC permissions.

PS > Connect-AzAccount


Step 3b: Adding the TransformKql using REST API and Powershell (alternative method)

As an alternative to using AzLogDcrIngestPS, you can also call the REST API to add the transformKql property.

transformKql requires a REST API call with a minimum api-version of 2021-09-01-preview. You can see the syntax below.

Currently, the recommended API is 2022-06-01.

You can see the available API versions in the GUI.
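
You can also list the available API versions for data collection rules with PowerShell, as in the sketch below:

# Sketch - list available API versions for Microsoft.Insights/dataCollectionRules
((Get-AzResourceProvider -ProviderNamespace "Microsoft.Insights").ResourceTypes |
    Where-Object { $_.ResourceTypeName -eq "dataCollectionRules" }).ApiVersions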


The steps to add a transformation to an existing DCR are:

  1. Retrieve the entire DCR using REST API (GET) in JSON format – and save it to a TXT file
  2. Edit the file – and add the transformKql parameter in the dataFlow section
  3. Upload the entire file content using REST API (PUT)

I have provided a PowerShell script on my GitHub to retrieve the DCR into a flat JSON file, so you can edit the file and add the needed transformation – and then upload the modified DCR again.

Below, I have created a sample folder C:\TMP where the file will be stored.

Start by setting the variables $ResourceId and $FilePath.

The ResourceId was noted in the previous step 2.

The FilePath can be any path – it is only a temporary file used for this change.

####################################################
# VARIABLES
####################################################

# here you put the ResourceID of the Data Collection Rules (a sample is provided below)
$ResourceId = "/subscriptions/xxxxxx/resourceGroups/rg-logworkspaces/providers/microsoft.insights/dataCollectionRules/dcr-ingest-exclude-security-eventid"
    
# here you put a path and file name where you want to store the temporary file-extract from DCR (a sample is provided below)
$FilePath   = "c:\tmp\dcr-ingest-exclude-security-eventid.txt"

Connect to Azure

####################################################
# CONNECT TO AZURE
####################################################

Connect-AzAccount

Run the DCR export

$DCR = Invoke-AzRestMethod -Path ("$ResourceId"+"?api-version=2022-06-01") -Method GET

$DCR.Content | ConvertFrom-Json | ConvertTo-Json -Depth 20 | Out-File -FilePath $FilePath

Now you have a JSON file in the C:\TMP folder.

Modify the file and add transformKql

Open the file using your favorite editor and add the transformKql line with the command you created in step 1:

"transformKql":  "source\n| where (EventID != 5145)",

NOTE: Remember to add the comma (,) at the end of the line so you do not break the JSON syntax.

"dataFlows": [
            {
                "streams": [
                    "Microsoft-SecurityEvent"
                ],
                "destinations": [
                    "DataCollectionEvent"
                ],
                "transformKql": "source | where (EventID != 8002) and (EventID != 4688) and (EventID != 4663)",
                "outputStream": "Microsoft-SecurityEvent"
            }
        ],
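
If you prefer not to edit the file by hand, the property can also be added from PowerShell. A small sketch, assuming the DCR has already been exported to $FilePath as shown above (the transformKql value is the example from step 1):

# Sketch - add transformKql to the exported DCR without manual editing
$DcrObject = Get-Content $FilePath -Raw | ConvertFrom-Json

$DcrObject.properties.dataFlows[0] | Add-Member -NotePropertyName "transformKql" -NotePropertyValue "source | where (EventID != 5145)" -Force

$DcrObject | ConvertTo-Json -Depth 20 | Out-File -FilePath $FilePath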

Upload the modified DCR (overwrite)

Now run the last part of the PowerShell script, which updates the DCR by taking the entire content of the local file and making a PUT REST call against api-version 2022-06-01.

####################################################
# UPLOAD FILE / UPDATE DCR WITH TRANSFORM
####################################################

$DCRContent = Get-Content $FilePath -Raw 

Invoke-AzRestMethod -Path ("$ResourceId"+"?api-version=2022-06-01") -Method PUT -Payload $DCRContent

You should get a StatusCode 200 from the PUT command, indicating everything was updated correctly.

If there is an error in the file structure, you will get a 400 error.
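
A small sketch of how to capture the response and check the status code:

# Sketch - capture the result of the PUT call and check the status code
$Result = Invoke-AzRestMethod -Path ("$ResourceId"+"?api-version=2022-06-01") -Method PUT -Payload $DCRContent

If ($Result.StatusCode -eq 200) {
    Write-Output "DCR was updated successfully"
} Else {
    Write-Output "Update failed with StatusCode $($Result.StatusCode)"
    $Result.Content
}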


Step 4: Verification of changes in DCR (optional)

You can choose to check for the change in the GUI. Remember to choose API-version 2022-06-01 (or 2021-09-01-preview); otherwise, you won't be able to see the change.

You will now see the transformKql.

You can also verify the change by running the first lines again to extract the DCR into the local file using the GET command:

####################################################
# EXPORT EXISTING DCR TO FILE
####################################################

$DCR = Invoke-AzRestMethod -Path ("$ResourceId"+"?api-version=2022-06-01") -Method GET

$DCR.Content | ConvertFrom-Json | ConvertTo-Json -Depth 20 | Out-File -FilePath $FilePath


Step 5: Associate the DCR rule to the machine(s)

Lastly, you have to associate the DCR with the machine(s) where you want the transformation to happen.

You have to wait for the pipeline transformation to take effect. Normally it will start within 5 minutes, sometimes a little longer.
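
If you want to create the association from PowerShell instead of the GUI, it can be done with a REST call against the dataCollectionRuleAssociations API, as sketched below – the VM resource id and association name are examples, so replace them with your own values:

####################################################
# ASSOCIATE DCR TO MACHINE (sketch - VM resource id and association name are examples)
####################################################

$VmResourceId    = "/subscriptions/xxxxxx/resourceGroups/rg-servers/providers/Microsoft.Compute/virtualMachines/srv01"
$AssociationName = "dcr-ingest-exclude-security-eventid-association"

$Body = @{
    properties = @{
        dataCollectionRuleId = $ResourceId
    }
} | ConvertTo-Json -Depth 5

Invoke-AzRestMethod -Path ("$VmResourceId" + "/providers/Microsoft.Insights/dataCollectionRuleAssociations/" + $AssociationName + "?api-version=2022-06-01") -Method PUT -Payload $Body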


Real-life examples of effect of transformations

Below you can find some examples of transformations and their effect – with a focus on cost reduction.

As you can see, the result was a significant decrease.

Of course, there is a trade-off that must be considered. There are always two sides of the coin – will I miss the data at some point?
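
If you want to measure the effect in your own environment, you can query the Usage table in Log Analytics to see the daily, billable ingestion for a table before and after the transformation. A sketch for the Event table (the 30-day window is just an example):

Usage
| where TimeGenerated > ago(30d)
| where DataType == "Event"
| where IsBillable == true
| summarize IngestedMB = sum(Quantity) by bin(TimeGenerated, 1d)
| order by TimeGenerated asc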

Example 1 – removing specific SecurityEvent (5145) from a particular server

The effect on cost – shown here in DKK – is that the daily cost drops from DKK 5,000 to DKK 2,300.

Example 2 – removing syslog traffic

Here is an example where I did a transformation to remove Syslog events with specific patterns:

source
| where (DeviceVendor !contains "sonicwall") or ((DeviceVendor contains "sonicwall") and (Activity contains "connection opened" or Activity contains "connection closed") and (Protocol != "udp/dns"))
