I second Pradeep's answer: Azure Stream Analytics can be too costly unless you really need full real-time analytics, so I don't suggest it here.
Azure Log Analytics can be a good alternative for your requirements to collect Event Logs, Perf Counters, IIS logs, etc. In my experience the data arrives close to real time, with a latency of around 10 seconds.
1) On all your VMs (on-premises or Azure, it does not matter), install the Log Analytics MMA agent (https://docs.microsoft.com/en-us/azure/azure-monitor/platform/agent-windows).
2) Configure the data you want to collect (Event Logs, Custom Logs, Perf Counters, IIS logs, etc.).
3) Data will start flowing into your Log Analytics workspace within about 20 minutes; see the verification sketch after this list.
4) You can also create alerts and monitors.
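To confirm the agents are reporting, here is a minimal sketch that checks the latest heartbeat from each connected machine. It assumes the Az.OperationalInsights module is installed, you are already logged in with Connect-AzAccount, and you replace the workspace ID placeholder with your own:
# Verify that agent data is arriving by querying the Heartbeat table
$WorkspaceId = "XXXXXXXXXXXXXXXX"   # replace with your Workspace ID
$query = "Heartbeat | where TimeGenerated > ago(15m) | summarize LastHeartbeat = max(TimeGenerated) by Computer"
$result = Invoke-AzOperationalInsightsQuery -WorkspaceId $WorkspaceId -Query $query
$result.Results | Format-Table Computer, LastHeartbeat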
@Jente: Below is sample code that inserts data into Log Analytics by creating a custom table.
# Sample code for inserting data into a Log Analytics table
# Reference: https://docs.microsoft.com/en-us/azure/azure-monitor/platform/data-collector-api
# Get the maintenance window details
$CloudName = Read-Host -Prompt 'Enter the cloud name for the maintenance window'
$starttime = Read-Host -Prompt 'Enter the start time of the maintenance window (UTC)'
$endtime = Read-Host -Prompt 'Enter the end time of the maintenance window (UTC)'
# Replace with your Workspace ID
$CustomerId = "XXXXXXXXXXXXXXXX"
# Replace with your Primary Key
$SharedKey = "XXXXXXXXXXXXXXXX"
# Specify the name of the record type that you'll be creating. The custom table appears under this name (with a _CL suffix) in Log Analytics
$LogType = "AZStack_Maintainence_Window"
# You can use an optional field to specify the timestamp from the data. If the time field is not specified, Azure Monitor assumes the time is the message ingestion time
$TimeStampField = ""
# Build the record as a hashtable and convert it to JSON
$json = @{CloudName = $CloudName; StartTime = $starttime; EndTime = $endtime}
$json = ConvertTo-Json $json
# To submit several records with the same set of properties in one call, pass a JSON array of objects instead of a single object
# Create the function to create the authorization signature
Function Build-Signature ($customerId, $sharedKey, $date, $contentLength, $method, $contentType, $resource)
{
    $xHeaders = "x-ms-date:" + $date
    $stringToHash = $method + "`n" + $contentLength + "`n" + $contentType + "`n" + $xHeaders + "`n" + $resource
    $bytesToHash = [Text.Encoding]::UTF8.GetBytes($stringToHash)
    $keyBytes = [Convert]::FromBase64String($sharedKey)
    $sha256 = New-Object System.Security.Cryptography.HMACSHA256
    $sha256.Key = $keyBytes
    $calculatedHash = $sha256.ComputeHash($bytesToHash)
    $encodedHash = [Convert]::ToBase64String($calculatedHash)
    $authorization = 'SharedKey {0}:{1}' -f $customerId, $encodedHash
    return $authorization
}
# Create the function to create and post the request
Function Post-LogAnalyticsData($customerId, $sharedKey, $body, $logType)
{
    $method = "POST"
    $contentType = "application/json"
    $resource = "/api/logs"
    $rfc1123date = [DateTime]::UtcNow.ToString("r")
    $contentLength = $body.Length
    $signature = Build-Signature `
        -customerId $customerId `
        -sharedKey $sharedKey `
        -date $rfc1123date `
        -contentLength $contentLength `
        -method $method `
        -contentType $contentType `
        -resource $resource
    $uri = "https://" + $customerId + ".ods.opinsights.azure.com" + $resource + "?api-version=2016-04-01"
    $headers = @{
        "Authorization" = $signature;
        "Log-Type" = $logType;
        "x-ms-date" = $rfc1123date;
        "time-generated-field" = $TimeStampField;
    }
    $response = Invoke-WebRequest -Uri $uri -Method $method -ContentType $contentType -Headers $headers -Body $body -UseBasicParsing
    return $response.StatusCode
}
# Submit the data to the API endpoint
Post-LogAnalyticsData -customerId $customerId -sharedKey $sharedKey -body ([System.Text.Encoding]::UTF8.GetBytes($json)) -logType $logType
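Once the POST returns 200, the records land in a custom table named after $LogType with a _CL suffix (the first records can take a few minutes to appear while the table is created). A minimal sketch to read them back, reusing the same workspace ID placeholder as above:
# Read back the submitted maintenance records from the custom table (Log Analytics appends _CL)
$query = "AZStack_Maintainence_Window_CL | take 10"
$result = Invoke-AzOperationalInsightsQuery -WorkspaceId "XXXXXXXXXXXXXXXX" -Query $query
$result.Results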
Below is also sample code for inserting Log Analytics query results into a SQL Server database. The same approach should work for Azure SQL Database as well.
# Sample script to insert data from Log Analytics into SQL
# This is just a sample, not ready for production use
function QueryLogAnalytics()
{
    Write-Host "Fetching the data from Log Analytics"
    $query = "Perf | summarize by ObjectName, CounterName"
    $queryResults = Invoke-AzOperationalInsightsQuery -WorkspaceId XXXXXXX -Query $query
    # Return the result rows rather than the wrapper object
    return $queryResults.Results
}
function Login()
{
    $Subscription = 'XXXXXX'
    try
    {
        Write-Host "Logging into Azure and selecting subscription..."
        if ([string]::IsNullOrEmpty($(Get-AzContext).Account))
        {
            Connect-AzAccount
        }
        else
        {
            Write-Host "Existing Az session detected. Skipping login prompt."
        }
        Select-AzSubscription -Subscription $Subscription -ErrorAction Stop | Out-Null
    }
    catch
    {
        Write-Error "Failed to login to Azure subscription with error $($_.Exception.Message)"
        Exit 1
    }
}
# Main
Login
$Results = QueryLogAnalytics
foreach ($Result in $Results)
{
    # Convert each result row to JSON and escape single quotes for the SQL string literal
    $ResultJson = ($Result | ConvertTo-Json -Compress).Replace("'", "''")
    Write-Host "Inserting result row as JSON"
    $insertquery = "INSERT INTO [dbo].[Res] VALUES ('$ResultJson')"
    Invoke-Sqlcmd -ServerInstance 'ARUNKRALAP' -Query $insertquery -Database Results
}
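The insert loop assumes the target table [dbo].[Res] already exists. Here is a minimal one-time setup sketch, assuming a single NVARCHAR(MAX) column to hold the JSON payload (the server, database, table, and column names are just the placeholders used above):
# One-time setup: create the target table used by the insert loop above
# The single NVARCHAR(MAX) column is an assumption; adjust the schema to your needs
$createQuery = @"
IF OBJECT_ID('dbo.Res', 'U') IS NULL
    CREATE TABLE [dbo].[Res] ([ResultJson] NVARCHAR(MAX) NOT NULL);
"@
Invoke-Sqlcmd -ServerInstance 'ARUNKRALAP' -Query $createQuery -Database Results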