Understanding the fundamentals of log-collection with Azure Monitor Agent & Azure Data Collection Rules

This blog will take you “under the hood” of extensions, Azure Monitor Agent (AMA) and Azure Data Collection Rules for AMA.

This blog-post is part of a series of blog posts to master Azure logging in depth (overview).

To get you started, you can find ARM-templates & scripts in my AzureLogLibrary (github). Details will be covered in the articles.


Quick links

Understanding AMA “under the hood”
Sources of data
Data Collection Rule structure
Where do extensions store their data?
Supported ways of doing data-transformation with AMA
Transformation using XPath
Tutorial – How to make data transformations using Data Collection Rules?
How to transform your data using TransformKql in a DCR using AMA
Deploying Azure Monitor Agent using Azure Policy
Deploying Azure Monitor Agent using Powershell to Azure Arc servers
Deploying Azure Monitor Agent using Powershell to Azure VMs
Troubleshooting DCR
Azure Monitor Agent troubleshooting


Understanding AMA “under the hood”

Azure Monitor Agent (AMA) is the replacement for the legacy agent, Microsoft Monitoring Agent (MMA).

AMA sends the data to the Azure data ingestion pipeline through a Data Collection Rule.

Inside Azure, Azure Monitor Agent is considered an extension – the new word for “agent”. Azure has multiple extensions, such as GuestAgent, DependencyAgent, MDE, etc.

When you go “under the hood”, all extensions are installed in C:\Packages\Plugins.

When looking at the platform, you will notice that the AMA extension runs as a process – and not as a Windows service.

The main reason for running as a process is that it gives Microsoft better opportunities to build logic that runs as child processes under the main process (AMAExtHealthMonitor). Previously, Microsoft had to update the MMA agent whenever they wanted to deliver new functionality.

Child processes in AMA can, for example, collect performance data, talk to another extension, or collect security events. Each child process is managed by a DCR – and will be downloaded (and updated) every 5 minutes (if changed).

AMA by itself cannot do anything – it is dependent on getting the definition of what to collect from a Data Collection Rule (DCR). I have prepared a library of deployment templates to get you started collecting data using Data Collection Rules.

In the next sections, you can read about what data sources are available – and the structure of a DCR when used together with AMA.

When you have deployed your DCRs, you will need to associate a Data Collection Rule with a VM.


Sources of data

You can retrieve data from many sources. Please see the list of data sources in the blog post.


Data Collection Rule structure

Data Collection Rules (DCRs) determine how to collect and process telemetry sent to Azure.

The structure differs depending on whether the data is sent via custom logs or Azure Monitor Agent.

A DCR for Azure Monitor Agent contains the following sections:

dataSources
This unique source of monitoring data has its own format and method of exposing its data. Examples of a data source include Windows event log, performance counters, and Syslog. Each data source matches a particular data source type, as described below.

Each data source has a data source type. Each type defines a unique set of properties that must be specified for each data source. The data source types currently available appear below:

extension
VM extension-based data source, used exclusively by Log Analytics solutions and Azure services (View agent supported services and solutions)

performanceCounters
Performance counters for both Windows and Linux

syslog
Syslog events on Linux

windowsEventLogs
Windows event log

streams
This unique handle describes a set of data sources that will be transformed and schematized as one type.

Each data source requires one or more streams, and one stream can be used by multiple data sources.

All data sources in a stream share a common schema. Use multiple streams, for example, when you want to send a particular data source to multiple tables in the same Log Analytics workspace.

destinations
This section contains a declaration of all the destinations where the data will be sent.

Multiple destinations are allowed for multi-homing scenarios. Each Log Analytics destination requires the full workspace resource ID and a friendly name that will be used elsewhere in the DCR to refer to this workspace.

dataFlows
This section ties the other sections together. It defines the following properties for each stream declared in the streamDeclarations section:

  • destination from the destinations section where the data will be sent.

  • transformKql, which is the transformation applied to the data that was sent in the input shape described in the streamDeclarations section to the shape of the target table.

  • outputStream, which describes which table in the workspace specified under the destination property the data will be ingested into.

The value of outputStream has the Microsoft-[tableName] shape when data is being ingested into a standard Log Analytics table, or Custom-[tableName] when ingesting data into a custom-created table. Only one destination is allowed per stream.

Below is an example of a complete DCR used together with Azure Monitor Agent:
{
    "properties": {
        "immutableId": "dcr-xxxxxxdc23b41e938b",
        "dataSources": {
            "windowsEventLogs": [
                {
                    "streams": [
                        "Microsoft-SecurityEvent"
                    ],
                    "xPathQueries": [
                        "Security!*",
                        "Microsoft-Windows-AppLocker/EXE and DLL!*",
                        "Microsoft-Windows-AppLocker/MSI and Script!*"
                    ],
                    "name": "eventLogsDataSource"
                }
            ]
        },
        "destinations": {
            "logAnalytics": [
                {
                    "workspaceResourceId": "/subscriptions/xxxxxxx-bf1701b862c3/resourceGroups/rg-logworkspaces/providers/Microsoft.OperationalInsights/workspaces/log-platform-management-srvnetworkcloud-p",
                    "workspaceId": "xxxxxxxxx9889301c7114",
                    "name": "DataCollectionEvent"
                }
            ]
        },
        "dataFlows": [
            {
                "streams": [
                    "Microsoft-SecurityEvent"
                ],
                "destinations": [
                    "DataCollectionEvent"
                ],
                "transformKql": "source | where (EventID != 8002) and (EventID != 5058) and (EventID != 4662) and (EventID != 4688)",
                "outputStream": "Microsoft-SecurityEvent"
            }
        ],
        "provisioningState": "Succeeded"
    },
    "location": "westeurope",
    "tags": {
        "createdBy": "Sentinel"
    },
    "kind": "Windows",
    "id": "/subscriptions/xxxxxxxxbf1701b862c3/resourceGroups/rg-logworkspaces/providers/Microsoft.Insights/dataCollectionRules/dcr-ingest-exclude-security-eventid",
    "name": "dcr-ingest-exclude-security-eventid",
    "type": "Microsoft.Insights/dataCollectionRules",
    "etag": "\"xxxxxx-0000-640e2bff0000\"",
    "systemData": {
        "createdBy": "mok@2linkit.net",
        "createdByType": "User",
        "createdAt": "2022-12-23T07:56:19.0759102Z",
        "lastModifiedBy": "mok@2linkit.net",
        "lastModifiedByType": "User",
        "lastModifiedAt": "2023-03-12T19:46:06.6129653Z"
    }
}

Where do extensions store their data?

Extensions – including AMA – store their data in C:\WindowsAzure.

AMA keeps its data here.

If you want to see the details of what is going on, you can run the AMA troubleshooting tool, found on the endpoint (CollectAMALogs.ps1). Please see more details later in this blog post.

Supported ways of doing data-transformation with AMA

Currently, there is support for doing data transformation using two methods when using AMA on endpoints:

  1. XPath filtering using Azure Monitor Agent
  2. Azure Monitor Agent (AMA) using TransformKql in Data Collection Rule (DCR)

Before going into configuring data transformation, I want to touch on the ”old” method using XPath (XML Path Language).

Transformation using XPath

Since AMA was launched in June 2021, it has supported transformation with XPath. This method works fine and has helped my customers save money.

Microsoft asked me to provide a statement about this in the official go-live announcement when Azure Monitor Agent went GA in June 2021.

Here is an example of using XPath where I am collecting all security events, excluding computer-related security events like computer logon traffic from specific Exchange servers.

Security!*[System[(band(Keywords,13510798882111488))]] and [(EventData[Data[@Name='TargetUserName'] != 'EXCHANGE01$'])] and [(EventData[Data[@Name='TargetUserName'] != 'EXCHANGE02$'])] and [(EventData[Data[@Name='TargetUserName'] != 'EXCHANGE03$'])] and [(EventData[Data[@Name='TargetUserName'] != 'EXCHANGE04$'])] and [(EventData[Data[@Name='TargetUserName'] != 'EXCHANGE05$'])] and [(EventData[Data[@Name='SubjectUserName'] != 'domain\\EXCHANGE01$'])] and [(EventData[Data[@Name='SubjectUserName'] != 'domain\\EXCHANGE02$'])] and [(EventData[Data[@Name='SubjectUserName'] != 'domain\\EXCHANGE03$'])] and [(EventData[Data[@Name='SubjectUserName'] != 'domain\\EXCHANGE04$'])] and [(EventData[Data[@Name='SubjectUserName'] != 'domain\\EXCHANGE05$'])] and [(EventData[Data[@Name='SubjectUserName'] != 'domain.DK\\EXCHANGE01$'])] and [(EventData[Data[@Name='SubjectUserName'] != 'domain.DK\\EXCHANGE02$'])] and [(EventData[Data[@Name='SubjectUserName'] != 'domain.DK\\EXCHANGE03$'])] and [(EventData[Data[@Name='SubjectUserName'] != 'domain.DK\\EXCHANGE04$'])] and [(EventData[Data[@Name='SubjectUserName'] != 'domain.DK\\EXCHANGE05$'])]

AMA still supports XPath, but I recommend using data transformation in DCRs, as it solves some of the limitations in the XPath implementation in AMA.

XPath is a standard, currently at version 3.1 (2017), but Microsoft chose to implement XPath v1 (from 1999) in AMA. V1 doesn’t support filtering with like or contains, so it is hard to scale with XPath v1 if you need advanced features.

Transformation with Azure Monitor Agent (AMA) & Azure Data Collection Rules (DCR)

I recommend doing transformation by using a DCR with the transformKql property. With this method, all transformations happen in the pipeline before data is sent into Azure LogAnalytics.


Tutorial – How to make data transformations using Data Collection Rules?

The below text is embedded from another blog post.

This section will show you the steps for setting up data transformations – and how you can do the transformation using AzLogDcrIngestPS (my new Powershell module) – or using REST API commands in Powershell.

If you want to read about transformation in depth, you can find more details here.

You can also get inspired by real-life samples (and their effects) of using transformation to reduce costs.

Quick Links

Data transformation architecture
High-level steps to do data transformation
Step 1: Kusto command must be defined
Escape characters in advanced queries
Step 2: Deploy a new DCR
Step 3a: Data transformation using AzLogIngestPS
Step 3b: Adding the TransformKql using REST API and Powershell (alternative method)
Step 4: Verification of changes in DCR (optional)
Step 5: Associate the DCR rule to the machine(s)
Real-life examples of effect of transformations
Example 1 – removing specific SecurityEvent (5145) from a particular server
Example 2 – removing syslog traffic

Data transformation architecture

Data transformations run in the Azure data ingestion pipeline and happen very fast as data is being uploaded.

Data transformations are defined in the transformKql property in the DCR section dataFlows.

dataFlows": [
            {
                "streams": [
                    "Microsoft-SecurityEvent"
                ],
                "destinations": [
                    "DataCollectionEvent"
                ],
                "transformKql": "source | where (EventID != 8002) and (EventID != 5058) and (EventID != 4662) ",
                "outputStream": "Microsoft-SecurityEvent"
            }

High-level steps to do data transformation

The 5 steps to add a data transformation are:

  1. Kusto command must be defined to understand the syntax to exclude the data
  2. Deploy a new DCR
  3. Add the transformKql to the DCR – I am covering 2 methods for this in the article (AzLogIngestPS and REST API)
  4. Verification of changes in DCR (optional)
  5. Assign the DCR to the machine(s) where the transformation must happen

In the example below, I am doing a transformation to remove data. Another example is to add new columns, based on incoming data.

I hope the below example gives you the insight to understand how to work with transformation.
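As a hedged illustration of the column scenario mentioned above, a transformKql value could use extend to calculate a new column from the incoming data – note that the destination table schema must already contain the column (ComputerUpper is just an example name):

source | extend ComputerUpper = toupper(Computer)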

Step 1: Kusto command must be defined

Transformations define which data should be sent through the pipeline.

If you want to exclude specific events, you will instruct the transformation to exclude these events using standard Kusto – except that you refer to the table name as source:

source | where (EventID != 4662)

Below are 4 samples:

SecurityEvent | where EventID != 5145
Here I want to see all Security events except for EventID 5145.

SecurityEvent | where (EventID != 8002) and (EventID != 5058) and (EventID != 4662)
Here I want to see all Security events except for EventID 8002, 5058 and 4662.

Event | where (EventLog != "Application") or (EventID != 10016)
Here I want to see all System and Application events, except for Application events with EventID 10016.

CommonSecurityLog | where (DeviceVendor !contains "sonicwall") or ((DeviceVendor contains "sonicwall") and (Activity contains "connection opened" or Activity contains "connection closed") and (Protocol != "udp/dns"))
Here I want to see all CEF/Syslog events where DeviceVendor is different from SonicWall (like Cisco), plus SonicWall connection opened/closed events, except if the protocol is udp/dns.

Since the transformation is applied to each record individually, it can’t use any KQL operators that act on multiple records. Only operators that take a single row as input and return no more than one row are supported.

For example, summarize isn’t supported since it summarizes multiple records. See Supported KQL features for a complete list of supported features.

Therefore, you cannot use cross-workspace references, such as doing a lookup in another table.

When the Kusto command works as expected, change the table name to source.

SecurityEvent | where EventID != 5145

was changed to

source | where EventID != 5145

Escape characters in advanced queries

If you work with advanced queries, you may need to adjust the query with escape characters:

SecurityEvent | where (Account != "WINSFTP\\autotest") and (EventID != 4688 and EventID != 8002 and EventID != 4625) and (Account != "WORKGROUP\\WINSFTP$")

will be changed to

source\r\n| where (Account != \"WINSFTP\\\\autotest\") and (EventID != 4688 and EventID != 8002 and EventID != 4625) and (Account != \"WORKGROUP\\\\WINSFTP$\")

Using an online converter

When working with advanced queries, I prefer to take my Kusto query and paste it into an online converter, which will add the escape characters for me.

Personally, I use the online website https://jsonformatter.org/json-escape
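If you prefer to stay in Powershell, a small sketch like the one below does the same escaping locally – ConvertTo-Json wraps the string in quotes and escapes backslashes and quotes (the query is just an example):

# Example KQL query with backslashes and quotes (replace with your own)
$Kql = 'source | where (Account != "WORKGROUP\WINSFTP$") and (EventID != 4688)'

# Outputs the JSON-escaped version, ready to paste into the transformKql property
$Kql | ConvertTo-Json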

Step 2: Deploy a new DCR

Configure a standard DCR and choose to collect what you want – for example all security events, or all system and application events.

Make a note of the ResourceId of the DCR. You will use the ResourceId in step 3.
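If you prefer Powershell over the portal, a small sketch like this can retrieve the ResourceId (the resource group and DCR names are examples – depending on your Az.Monitor version the name parameter may be -RuleName instead of -Name):

# Get the ResourceId of the DCR (names below are examples)
$Dcr = Get-AzDataCollectionRule -ResourceGroupName "rg-logworkspaces" -Name "dcr-ingest-exclude-security-eventid"
$ResourceId = $Dcr.Id
$ResourceId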


Step 3a: Data transformation using AzLogIngestPS

It is very easy to work with transformations using my powershell module, AzLogIngestPS.

You can read about my powershell module, AzLogIngestPS here

To get started viewing an existing transformation, you will only require the ResourceId, which was noted in the previous step 2.

You can view an existing data transformation (transformKql) using the function Get-AzDataCollectionRuleTransformKql:

Get-AzDataCollectionRuleTransformKql  -DcrResourceID <DCR resourceid>

Here you can see the value of the transformKql (the line starting with source):

PS > Get-AzDataCollectionRuleTransformKql -DcrResourceId /subscriptions/fxxxxx4d8-bf1701b862c3/resourceGroups/rg-logworkspaces/providers/microsoft.insights/dataCollectionRules/dcr-ingest-exclude-security-eventid
source | where (EventID != 8002) and (EventID != 5058) and (EventID != 4662) and (EventID != 4688)

You can add a transformation to an existing DCR using the function Update-AzDataCollectionRuleTransformKql

Update-AzDataCollectionRuleTransformKql -DcrResourceId <DCR-ID> -transformKql  <transform-KQL>

Here you can see the value before and after updating the transformation.

Value before

PS > Get-AzDataCollectionRuleTransformKql -DcrResourceId /subscriptions/fce4f282-fcc6-43fb-94d8-bf1701b862c3/resourceGroups/rg-logworkspaces/providers/microsoft.insights/dataCollectionRules/dcr-ingest-exclude-security-eventid
source | where (EventID != 8002) and (EventID != 5058) and (EventID != 4662) and (EventID != 4688)

Adding the transformation

PS > Update-AzDataCollectionRuleTransformKql -DcrResourceId /subscriptions/xxxxxx-43fb-94d8-bf1701b862c3/resourceGroups/rg-logworkspaces/providers/microsoft.insights/dataCollectionRules/dcr-ingest-exclude-security-eventid -transformKql  "source | where (EventID != 8002) and (EventID != 5058) and (EventID != 4662) and (EventID != 4688) and (EventID != 4663)"

Updating transformKql for DCR
/subscriptions/xxxxxfb-94d8-bf1701b862c3/resourceGroups/rg-logworkspaces/providers/microsoft.insights/dataCollectionRules/dcr-ingest-exclude-security-eventid

Value after – note that “and (EventID != 4663)” has been added compared to the value before.

PS > Get-AzDataCollectionRuleTransformKql -DcrResourceId /subscriptions/xxxxx3fb-94d8-bf1701b862c3/resourceGroups/rg-logworkspaces/providers/microsoft.insights/dataCollectionRules/dcr-ingest-exclude-security-eventid
source | where (EventID != 8002) and (EventID != 5058) and (EventID != 4662) and (EventID != 4688) and (EventID != 4663)

You can install AzLogDcrIngestPS from Powershell gallery

Install-Module AzLogDcrIngestPS

You need to connect to Azure first with an account that has RBAC permissions.

PS > Connect-AzAccount


Step 3b: Adding the TransformKql using REST API and Powershell (alternative method)

As an alternative to using AzLogIngestPS, you can also call the REST API and add the transformKql command.

TransformKql requires a REST API call with a minimum API version of 2021-09-01-preview. You can see the syntax below.

Currently, the recommended API version is 2022-06-01.

You can see the available API versions in the GUI.


The steps to add a transformation to an existing DCR are:

  1. Retrieve the entire DCR using REST API (GET) in JSON format – and save it to a TXT file
  2. Edit the file – and add the transformKql parameter in the dataFlow section
  3. Upload the entire file content using REST API (PUT)

I have provided a Powershell script on my GitHub to retrieve the DCR into a flat JSON file, so you can edit the file, add the needed transformation, and then upload the modified DCR again.

Below I have created a sample folder, C:\TMP, where the file will be stored.

Start by setting the variables $ResourceId and $FilePath.

ResourceId was noted in the previous step 2.

FilePath can be any path – it is only a temporary file used for this change.

####################################################
# VARIABLES
####################################################

# here you put the ResourceID of the Data Collection Rules (a sample is provided below)
$ResourceId = "/subscriptions/xxxxxx/resourceGroups/rg-logworkspaces/providers/microsoft.insights/dataCollectionRules/dcr-ingest-exclude-security-eventid"
    
# here you put a path and file name where you want to store the temporary file-extract from DCR (a sample is provided below)
$FilePath   = "c:\tmp\dcr-ingest-exclude-security-eventid.txt"

Connect to Azure

####################################################
# CONNECT TO AZURE
####################################################

Connect-AzAccount

Run the export of the DCR:

$DCR = Invoke-AzRestMethod -Path ("$ResourceId"+"?api-version=2022-06-01") -Method GET

$DCR.Content | ConvertFrom-Json | ConvertTo-Json -Depth 20 | Out-File -FilePath $FilePath

Now you have a JSON file in the C:\TMP folder.

Modify file and add TransformKql

Open the file using your favorite editor and add the transformKql line with the command you created in step 1:

"transformKql":  "source\n| where (EventID != 5145)",

NOTE: Remember to add the , (comma) at the end of the line so you are not breaking the JSON syntax.

"dataFlows": [
            {
                "streams": [
                    "Microsoft-SecurityEvent"
                ],
                "destinations": [
                    "DataCollectionEvent"
                ],
                "transformKql": "source | where (EventID != 8002) and (EventID != 4688) and (EventID != 4663)",
                "outputStream": "Microsoft-SecurityEvent"
            }
        ],

Upload the modified DCR (overwrite)

Now run the last part of the Powershell script, which updates the DCR by taking the entire content of the local file and making a PUT REST call against API version 2022-06-01.

####################################################
# UPLOAD FILE / UPDATE DCR WITH TRANSFORM
####################################################

$DCRContent = Get-Content $FilePath -Raw 

Invoke-AzRestMethod -Path ("$ResourceId"+"?api-version=2022-06-01") -Method PUT -Payload $DCRContent

You should get a StatusCode 200 with the PUT command, indicating everything was updated correctly.

If there is an error in the file structure, you will get a 400 error.
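If you want the script to check the result by itself, a small sketch could capture the response from Invoke-AzRestMethod (same variables as above):

$Result = Invoke-AzRestMethod -Path ("$ResourceId"+"?api-version=2022-06-01") -Method PUT -Payload $DCRContent

# 200 = DCR updated correctly, 400 = error in the uploaded file structure
If ($Result.StatusCode -eq 200)
    { Write-Output "DCR was updated successfully" }
Else
    { Write-Output "Update failed ($($Result.StatusCode)): $($Result.Content)" }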


Step 4: Verification of changes in DCR (optional)

You can choose to check for the change in the GUI. Remember to choose API version 2022-06-01 (or 2021-09-01-preview); otherwise, you won’t be able to see the change.

You will now see the transformKql.

You can also verify the changes by running the first lines again to extract the DCR into the local file using the GET command:

####################################################
# EXPORT EXISTING DCR TO FILE
####################################################

$DCR = Invoke-AzRestMethod -Path ("$ResourceId"+"?api-version=2022-06-01") -Method GET

$DCR.Content | ConvertFrom-Json | ConvertTo-Json -Depth 20 | Out-File -FilePath $FilePath


Step 5: Associate the DCR rule to the machine(s)

Lastly, you have to associate the DCR with the machine(s) where you want the transformation to happen.
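If you want to script the association as well, a hedged sketch using the same REST approach as earlier could look like this – the association is created as a sub-resource on the VM/Arc machine (resource IDs and the association name are examples):

# ResourceId of the machine (Azure VM or Azure Arc server) - example value
$VmResourceId    = "/subscriptions/xxxxxx/resourceGroups/rg-production/providers/Microsoft.Compute/virtualMachines/DC3"

# ResourceId of the DCR (noted in step 2) and a name for the association (example)
$DcrResourceId   = $ResourceId
$AssociationName = "dcr-ingest-exclude-security-eventid-association"

# Build the association payload and create/update it with a PUT call
$Body = @{ properties = @{ dataCollectionRuleId = $DcrResourceId } } | ConvertTo-Json -Depth 5

Invoke-AzRestMethod -Path ("$VmResourceId/providers/Microsoft.Insights/dataCollectionRuleAssociations/$AssociationName"+"?api-version=2022-06-01") -Method PUT -Payload $Body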

Then you have to wait for the pipeline transformation to take effect. Normally it will start within 5 minutes, sometimes a little longer.


Real-life examples of effect of transformations

Below you can find some examples of using transformations and their effect – with a focus on cost reduction.

As you can see, the result was a significant decrease.

Of course there is a trade-off, which must be considered. There are always two sides of the coin – will I miss the data at some point?

Example 1 – removing specific SecurityEvent (5145) from a particular server

The effect in cost – here shown in DKK, where the daily cost drops from DKK 5,000 to DKK 2,300.

Example 2 – removing syslog traffic

Here is an example where I did a transformation removing Syslog events with specific patterns.

source
| where (DeviceVendor !contains "sonicwall") or ((DeviceVendor contains "sonicwall") and (Activity contains "connection opened" or Activity contains "connection closed") and (Protocol != "udp/dns"))


Deploying Azure Monitor Agent using Azure Policy

You need to assign the following policy definitions to the scope of servers:

  • Configure Windows virtual machines to run Azure Monitor Agent using system-assigned managed identity
  • Configure Windows virtual machine scale sets to run Azure Monitor Agent using system-assigned managed identity
  • Configure Windows Arc-enabled machines to run Azure Monitor Agent

The policies automatically set a system-assigned managed identity on the VMs/computers, which is needed.

If you assign the policy to an existing management group or subscription, you need to run a policy remediation to force the extension to be applied to existing Azure resources.
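As a hedged sketch, a remediation task could be started from Powershell with the Az.PolicyInsights module (the assignment ID and names are examples):

# Start a remediation task for an existing policy assignment (example values)
$AssignmentId = "/subscriptions/xxxxxx/providers/Microsoft.Authorization/policyAssignments/deploy-ama-windows-vms"

Start-AzPolicyRemediation -Name "remediate-ama-windows-vms" `
                          -PolicyAssignmentId $AssignmentId `
                          -ResourceDiscoveryMode ReEvaluateCompliance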

After deploying the extension using policy, you will need to associate the DCRs with the Azure resources you want to collect data from.

You can also configure the DCRs using policy by assigning the following policy to the scope:

  • Configure Windows Machines to be associated with a Data Collection Rule or a Data Collection Endpoint


Personally, I prefer to associate DCRs with servers using automation, where I can control the targeting, handle exceptions, etc. For example, I want to control whether the extension should connect directly – or through a proxy (a sketch of the proxy configuration is shown after the deployment examples below). I do this based on where the servers are placed (IP subnets, DMZ).

Deploying Azure Monitor Agent using Powershell to Azure Arc servers

You can use the Az.ConnectedMachine module to deploy the AMA extension using Powershell on Azure Arc-connected machines:

Set-AzConnectedMachineExtension -Name "AzureMonitorWindowsAgent" `
                                -ResourceGroupName "<resource group for VM>" `
                                -MachineName "<vm name>" `
                                -Publisher "Microsoft.Azure.Monitor" `
                                -TypeHandlerVersion 1.13 `
                                -ExtensionType "AzureMonitorWindowsAgent" `
                                -Location "<Azure region>" `
                                -NoWait

Example

Set-AzConnectedMachineExtension -Name "AzureMonitorWindowsAgent" `
                                -ResourceGroupName "rg-azurearc-servers-denmark" `
                                -MachineName "DC6" `
                                -Publisher "Microsoft.Azure.Monitor" `
                                -TypeHandlerVersion 1.13 `
                                -ExtensionType "AzureMonitorWindowsAgent" `
                                -Location "westeurope" `
                                -NoWait

Deploying Azure Monitor Agent using Powershell to Azure VMs

You can use the Az.Compute module to deploy the AMA extension using Powershell on native Azure VMs:

Set-AzVMExtension -ExtensionName "AzureMonitorWindowsAgent" `
                  -ResourceGroupName "<resource group for VM>" `
                  -VMName "<vm name>" `
                  -Publisher "Microsoft.Azure.Monitor" `
                  -TypeHandlerVersion 1.13 `
                  -ExtensionType "AzureMonitorWindowsAgent" `
                  -Location "<Azure region>" `
                  -EnableAutomaticUpgrade $true `
                  -NoWait

Example

Set-AzVMExtension -ExtensionName "AzureMonitorWindowsAgent" `
                  -ResourceGroupName "RG-PRODUCTION-WESTEUROPE" `
                  -VMName "DC3" `
                  -Publisher "Microsoft.Azure.Monitor" `
                  -TypeHandlerVersion 1.13 `
                  -ExtensionType "AzureMonitorWindowsAgent" `
                  -Location "westeurope" `
                  -EnableAutomaticUpgrade $true `
                  -NoWait
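
If the extension must connect through a proxy, as mentioned in the policy section above, a hedged sketch could pass the proxy settings object during deployment – the proxy address below is an example:

# Example AMA proxy settings - the address is an assumption, replace with your own proxy
$AmaSettings = @{
    proxy = @{
        mode    = "application"
        address = "http://proxy.contoso.local:8080"
        auth    = "false"
    }
}

Set-AzVMExtension -ExtensionName "AzureMonitorWindowsAgent" `
                  -ResourceGroupName "RG-PRODUCTION-WESTEUROPE" `
                  -VMName "DC3" `
                  -Publisher "Microsoft.Azure.Monitor" `
                  -TypeHandlerVersion 1.13 `
                  -ExtensionType "AzureMonitorWindowsAgent" `
                  -Location "westeurope" `
                  -EnableAutomaticUpgrade $true `
                  -Settings $AmaSettings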

Troubleshooting DCR

If you want to verify that the local Azure Monitor Agent has received the DCR, this can be done using the local AMA files. AMA checks for new DCR changes every 5 minutes.

Verification using local AMA config files

Open your DCR in JSON view and note the immutableId of your DCR.

Open the file C:\WindowsAzure\Resources\AMADataStore.DC1\mcs\mcsconfig.latest.xml (or mcsconfig.latest.json).

Search for the immutableId. You should now see that the agent is using the new DCR.
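A hedged sketch of doing the same search from Powershell – the immutableId value and the AMADataStore folder name are examples, adjust them to your DCR and machine:

# Search the local AMA config for the immutableId of your DCR (example value)
$ImmutableId = "dcr-xxxxxxdc23b41e938b"

Select-String -Path "C:\WindowsAzure\Resources\AMADataStore.DC1\mcs\mcsconfig.latest.xml" -Pattern $ImmutableId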

If AMA is not receiving the DCR, verify that you have associated the DCR with the server – you can list the machine’s DCR associations with the sketch below. Otherwise, continue to the AMA troubleshooting section below.
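A sketch for listing the associations, assuming the Az.Monitor module (in newer module versions the parameter may be named -ResourceUri; the resource ID is an example):

# ResourceId of the machine (Azure VM or Azure Arc server) - example value
$VmResourceId = "/subscriptions/xxxxxx/resourceGroups/rg-production/providers/Microsoft.Compute/virtualMachines/DC3"

# List all DCRs associated with the machine
Get-AzDataCollectionRuleAssociation -TargetResourceId $VmResourceId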

Azure Monitor Agent troubleshooting

Azure Monitor Agent has a troubleshooting Powershell script, which might be able to help you.

Run the following Powershell script:

C:\Packages\Plugins\Microsoft.Azure.Monitor.AzureMonitorWindowsAgent\<latest version on your computer>\CollectAMALogs.ps1

It will generate a zip file on your desktop with all the log files.

Open the file maeventtable.csv in Excel and filter on:

  • Stream = MonAgentManager.exe
  • File = refreshconfigurations.cpp

You should now be able to see the DCR changes.
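If you prefer Powershell over Excel, a hedged sketch for the same filtering could look like this (the path is an example – point it to the extracted zip content):

# Filter the AMA event table for DCR refresh entries (path is an example)
Import-Csv "C:\TMP\maeventtable.csv" |
    Where-Object { $_.Stream -eq "MonAgentManager.exe" -and $_.File -eq "refreshconfigurations.cpp" }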
