If you are sending data using the HTTP Data Collector API (REST) today, you should continue reading, as this API will be deprecated as part of the transition to the Log Ingestion API using Azure Data Collection Rules, the Azure Pipeline, and Azure LogAnalytics custom tables (v2).
As you can see from the illustration, more components (DCE, DCR, Pipeline, Schema) are added, which also increases the complexity.
I have built a PowerShell module, AzLogDcrIngestPS, which will ease the steps if you want to send any data to Azure LogAnalytics custom logs (v2) – using the new features of the Azure Log Ingestion Pipeline, Azure Data Collection Rules & the Log Ingestion API.
AzLogDcrIngestPS includes 25 functions dealing with:
- data manipulation before sending data in (7 functions)
- table / dcr / schema / transformation management (13 functions)
- data upload using Azure Log Ingestion Pipeline / Log Ingestion API (4 functions)
- support/security (1 function)
This blog post covers tips & troubleshooting options using functions in AzLogDcrIngestPS, so data can be uploaded into Azure LogAnalytics using the Log Ingestion API.
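Before trying the examples below, you need the module available on the machine. A minimal sketch, assuming the module is published to the PowerShell Gallery under the name AzLogDcrIngestPS:

```powershell
# Install the module for the current user from the PowerShell Gallery
# (assumes the module is published there under this name)
Install-Module -Name AzLogDcrIngestPS -Scope CurrentUser -Force

# Load the module into the current session and list its exported functions
Import-Module -Name AzLogDcrIngestPS
Get-Command -Module AzLogDcrIngestPS
```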
Error: The location property is required for this definition
If you receive an error similar to the one below, it is caused by the script not being able to find the DceName you specified. Check for spelling mistakes.
invoke-webrequest : {"error":{"code":"LocationRequired","message":"The location property is required for this definition."}}
At C:\Program Files\WindowsPowerShell\Modules\AzLogDcrIngestPS\1.2.37\AzLogDcrIngestPS.psm1:1635 char:21
+ ... invoke-webrequest -UseBasicParsing -Uri $Uri -Method PUT ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : InvalidOperation: (System.Net.HttpWebRequest:HttpWebRequest) [Invoke-WebRequest], WebException
+ FullyQualifiedErrorId : WebCmdletWebResponseException,Microsoft.PowerShell.Commands.InvokeWebRequestCommand
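One way to verify the name is to list the data collection endpoints your app registration can see and compare them against the value you pass as -DceName. This sketch uses the module's Get-AzDceListAll function; the parameter names follow the pattern used by the module's other cmdlets shown later in this post and may differ in your version:

```powershell
# Sketch: list all DCEs visible to the app registration and check the name.
# $LogIngestAppId, $LogIngestAppSecret, $TenantId and $DceName are assumed
# to be set as in the Post-command example later in this post.
$AllDce = Get-AzDceListAll -AzAppId $LogIngestAppId `
                           -AzAppSecret $LogIngestAppSecret `
                           -TenantId $TenantId `
                           -Verbose:$true

# Compare against the DCE name you are using
$AllDce.name
$AllDce | Where-Object { $_.name -eq $DceName }
```

If the last line returns nothing, the DceName does not match any existing endpoint, which is what triggers the LocationRequired error above.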
Error 513 – entity is too large
By default, the Post functions send the data in batches based on a calculated average size per record. If your records vary significantly in size, you might receive an error 513.
The cause is that you are hitting the 1 MB limit per upload (an Azure Pipeline limitation). Microsoft prefers to receive many smaller chunks of data, as this is a shared environment. I have seen this issue when retrieving the list of all installed applications; apparently, the amount of information stored per application varies greatly.
You can mitigate this issue by adding the parameter -BatchAmount to the Post-command. If you want to be sure, set it to 1.
Post-AzLogAnalyticsLogIngestCustomLogDcrDce-Output -DceName $DceName `
-DcrName $DcrName `
-Data $DataVariable `
-TableName $TableName `
-AzAppId $LogIngestAppId `
-AzAppSecret $LogIngestAppSecret `
-TenantId $TenantId `
-BatchAmount 1 `
-Verbose:$Verbose
Verbose-mode
If you want to get more detailed information about what is happening, you can enable verbose mode (-Verbose:$true).
Here is an example with ClientInspector, where verbose mode is activated.
.\ClientInspector.ps1 -verbose:$true -function:localpath
Here is an example where the function is called in verbose-mode
Build-DataArrayToAlignWithSchema -Data $DataVariable -Verbose:$true
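When troubleshooting a longer run, it can be useful to keep the verbose output for later. A small sketch using PowerShell's standard stream redirection (stream 4 is the verbose stream):

```powershell
# Redirect the verbose stream (stream 4) to a file while still running
# the script with verbose mode enabled, so the output can be reviewed later
.\ClientInspector.ps1 -verbose:$true -function:localpath 4> verbose.log
```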
How can I see which cmdlets are available in the module?
If you want to see which cmdlets are available in the AzLogDcrIngestPS module, you can use Get-Command:
PS get-command -module AzLogDcrIngestPS
CommandType Name Version Source
----------- ---- ------- ------
Function Add-CollectionTimeToAllEntriesInArray 1.1.17 AzLogDcrIngestPS
Function Add-ColumnDataToAllEntriesInArray 1.1.17 AzLogDcrIngestPS
Function Build-DataArrayToAlignWithSchema 1.1.17 AzLogDcrIngestPS
Function CheckCreateUpdate-TableDcr-Structure 1.1.17 AzLogDcrIngestPS
Function Convert-CimArrayToObjectFixStructure 1.1.17 AzLogDcrIngestPS
Function Convert-PSArrayToObjectFixStructure 1.1.17 AzLogDcrIngestPS
Function CreateUpdate-AzDataCollectionRuleLogIngestCusto... 1.1.17 AzLogDcrIngestPS
Function CreateUpdate-AzLogAnalyticsCustomLogTableDcr 1.1.17 AzLogDcrIngestPS
Function Delete-AzDataCollectionRules 1.1.17 AzLogDcrIngestPS
Function Delete-AzLogAnalyticsCustomLogTables 1.1.17 AzLogDcrIngestPS
Function Filter-ObjectExcludeProperty 1.1.17 AzLogDcrIngestPS
Function Get-AzAccessTokenManagement 1.1.17 AzLogDcrIngestPS
Function Get-AzDceListAll 1.1.17 AzLogDcrIngestPS
Function Get-AzDcrDceDetails 1.1.17 AzLogDcrIngestPS
Function Get-AzDataCollectionRuleTransformKql 1.1.17 AzLogDcrIngestPS
Function Get-AzDcrListAll 1.1.17 AzLogDcrIngestPS
Function Get-AzLogAnalyticsTableAzDataCollectionRuleStatus 1.1.17 AzLogDcrIngestPS
Function Get-ObjectSchemaAsArray 1.1.17 AzLogDcrIngestPS
Function Get-ObjectSchemaAsHash 1.1.17 AzLogDcrIngestPS
Function Post-AzLogAnalyticsLogIngestCustomLogDcrDce 1.1.17 AzLogDcrIngestPS
Function Post-AzLogAnalyticsLogIngestCustomLogDcrDce-Output 1.1.17 AzLogDcrIngestPS
Function Update-AzDataCollectionRuleDceEndpoint 1.1.17 AzLogDcrIngestPS
Function Update-AzDataCollectionRuleResetTransformKqlDef... 1.1.17 AzLogDcrIngestPS
Function Update-AzDataCollectionRuleTransformKql 1.1.17 AzLogDcrIngestPS
Function ValidateFix-AzLogAnalyticsTableSchemaColumnNames 1.1.17 AzLogDcrIngestPS
How can I get access to the help, parameters, syntax and examples – using Get-Help?
Get help with a specific cmdlet – get-help Add-CollectionTimeToAllEntriesInArray -full
get-help Add-CollectionTimeToAllEntriesInArray -full
NAME
Add-CollectionTimeToAllEntriesInArray
SYNOPSIS
Add property CollectionTime (based on current time) to all entries on the object
SYNTAX
Add-CollectionTimeToAllEntriesInArray [-Data] <Array> [<CommonParameters>]
DESCRIPTION
Gives capability to do proper searching in queries to find latest set of records with same collection time
Time Generated cannot be used when you are sending data in batches, as TimeGenerated will change
An example where this is important is a complete list of applications for a computer. We want all applications to
show up when querying for the latest data
PARAMETERS
-Data <Array>
Object to modify
Required? true
Position? 1
Default value
Accept pipeline input? false
Accept wildcard characters? false
<CommonParameters>
This cmdlet supports the common parameters: Verbose, Debug,
ErrorAction, ErrorVariable, WarningAction, WarningVariable,
OutBuffer, PipelineVariable, and OutVariable. For more information, see
about_CommonParameters (https:/go.microsoft.com/fwlink/?LinkID=113216).
INPUTS
None. You cannot pipe objects
OUTPUTS
Updated object with CollectionTime
-------------------------- EXAMPLE 1 --------------------------
PS C:\>#-------------------------------------------------------------------------------------------
# Variables
#-------------------------------------------------------------------------------------------
$Verbose = $true # $true or $false
#-------------------------------------------------------------------------------------------
# Collecting data (in)
#-------------------------------------------------------------------------------------------
$DNSName = (Get-CimInstance win32_computersystem).DNSHostName +"." + (Get-CimInstance win32_computersystem).Domain
$ComputerName = (Get-CimInstance win32_computersystem).DNSHostName
[datetime]$CollectionTime = ( Get-date ([datetime]::Now.ToUniversalTime()) -format "yyyy-MM-ddTHH:mm:ssK" )
$UserLoggedOnRaw = Get-Process -IncludeUserName -Name explorer | Select-Object UserName -Unique
$UserLoggedOn = $UserLoggedOnRaw.UserName
$DataVariable = Get-CimInstance -ClassName Win32_Processor | Select-Object -ExcludeProperty "CIM*"
#-------------------------------------------------------------------------------------------
# Preparing data structure
#-------------------------------------------------------------------------------------------
$DataVariable = Convert-CimArrayToObjectFixStructure -data $DataVariable -Verbose:$Verbose
$DataVariable
# add CollectionTime to existing array
$DataVariable = Add-CollectionTimeToAllEntriesInArray -Data $DataVariable -Verbose:$Verbose
$DataVariable
#-------------------------------------------------------------------------------------------
# Output
#-------------------------------------------------------------------------------------------
VERBOSE: Adding CollectionTime to all entries in array .... please wait !
Caption : Intel64 Family 6 Model 165 Stepping 5
Description : Intel64 Family 6 Model 165 Stepping 5
InstallDate :
Name : Intel(R) Core(TM) i7-10700 CPU @ 2.90GHz
Status : OK
Availability : 3
ConfigManagerErrorCode :
ConfigManagerUserConfig :
CreationClassName : Win32_Processor
DeviceID : CPU0
ErrorCleared :
ErrorDescription :
LastErrorCode :
PNPDeviceID :
PowerManagementCapabilities :
PowerManagementSupported : False
StatusInfo : 3
SystemCreationClassName : Win32_ComputerSystem
SystemName : STRV-MOK-DT-02
AddressWidth : 64
CurrentClockSpeed : 2904
DataWidth : 64
Family : 198
LoadPercentage : 1
MaxClockSpeed : 2904
OtherFamilyDescription :
Role : CPU
Stepping :
UniqueId :
UpgradeMethod : 1
Architecture : 9
AssetTag : To Be Filled By O.E.M.
Characteristics : 252
CpuStatus : 1
CurrentVoltage : 8
ExtClock : 100
L2CacheSize : 2048
L2CacheSpeed :
L3CacheSize : 16384
L3CacheSpeed : 0
Level : 6
Manufacturer : GenuineIntel
NumberOfCores : 8
NumberOfEnabledCore : 8
NumberOfLogicalProcessors : 16
PartNumber : To Be Filled By O.E.M.
ProcessorId : BFEBFBFF000A0655
ProcessorType : 3
Revision :
SecondLevelAddressTranslationExtensions : False
SerialNumber : To Be Filled By O.E.M.
SocketDesignation : U3E1
ThreadCount : 16
Version :
VirtualizationFirmwareEnabled : False
VMMonitorModeExtensions : False
VoltageCaps :
PSComputerName :
CollectionTime : 12-03-2023 16:08:33
RELATED LINKS
https://github.com/KnudsenMorten/AzLogDcrIngestPS