
I have multiple VMs up and running in the Azure cloud (Citrix workers, file servers, database servers, a DC, etc.), and on those Citrix workers I have some applications running. So far so good. Now I want to gather logging information: server health (memory usage, CPU, etc.), application health (application memory usage, application startup, etc.), and DB monitoring. I want to visualize the output in Power BI, containing both historical data (last 2 years, aggregated) and real-time data (Server1: Running, 80% memory use, etc.).

I was looking at the following setup: Event Hubs - Stream Analytics - SQL Server - Power BI. First, for the server health, I found a web page that describes how to set up diagnostic settings for VMs, but only via Visual Studio; is it not possible to do this via the portal (I'm doing a POC)? Secondly, I have a PowerShell script that gathers application information, but for now it's written to a CSV file; does anybody have experience using the .NET Core objects in PowerShell to send the metrics directly to Event Hub? Third and last question: is this the way to go, or are there better setups for this kind of gathering of logs/metrics, like for example Log Analytics or VM insights (Preview)?

Thanks!


2 Answers


Welcome to Stack Overflow!

Azure Stream Analytics is not the right choice for gathering logs from multiple Azure resources.

Reason:

Azure Stream Analytics is a real-time analytics and complex event-processing engine that is designed to analyze and process high volumes of fast streaming data from multiple sources simultaneously. Patterns and relationships can be identified in information extracted from a number of input sources including devices, sensors, clickstreams, social media feeds, and applications. These patterns can be used to trigger actions and initiate workflows such as creating alerts, feeding information to a reporting tool, or storing transformed data for later use. Also, Stream Analytics is available on the Azure IoT Edge runtime, and supports the exact same language and syntax as the cloud.

The following diagram (from the Stream Analytics documentation) shows how data is sent to Stream Analytics, analyzed, and sent on for other actions like storage or presentation:

[Diagram: input sources → Azure Stream Analytics → storage / presentation]

The following scenarios are examples of when you can use Azure Stream Analytics:

  • Analyze real-time telemetry streams from IoT devices

  • Web logs/clickstream analytics

  • Geospatial analytics for fleet management and driverless vehicles

  • Remote monitoring and predictive maintenance of high value assets

  • Real-time analytics on Point of Sale data for inventory control and anomaly detection

Azure Monitor maximizes the availability and performance of your applications by delivering a comprehensive solution for collecting, analyzing, and acting on telemetry from your cloud and on-premises environments.
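
If you go the Azure Monitor route, the server-health numbers (CPU, memory, etc.) can be pulled straight from PowerShell with the Az.Monitor module. A minimal sketch (the resource group and VM name are placeholders, and it assumes you are already logged in with Connect-AzAccount):

    # Sketch: pull the last hour of CPU metrics for one VM from Azure Monitor
    # Assumes the Az and Az.Monitor modules; resource names below are placeholders
    $vm = Get-AzVM -ResourceGroupName "MyResourceGroup" -Name "Server1"
    $metrics = Get-AzMetric -ResourceId $vm.Id `
        -MetricName "Percentage CPU" `
        -TimeGrain ([TimeSpan]::FromMinutes(5)) `
        -StartTime (Get-Date).AddHours(-1) `
        -EndTime (Get-Date) `
        -AggregationType Average
    $metrics.Data | Select-Object TimeStamp, Average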

Azure also provides a wide array of configurable security auditing and logging options to help you identify gaps in your security policies and mechanisms; the security logging documentation covers generating, collecting, and analyzing security logs from services hosted on Azure.

Hope this helps.


I second Pradeep's answer: Azure Stream Analytics can be too costly. Unless you really want to do some real-time analytics, I don't suggest Azure Stream Analytics.

Azure Log Analytics can be a good alternative for your requirements to collect Event Logs, Perf Counters, IIS Logs, etc. In my experience you can collect almost in real time, with a latency of about 10 seconds.

1) On all your VMs (on-premises or Azure, it doesn't matter), install the Log Analytics MMA agent (https://docs.microsoft.com/en-us/azure/azure-monitor/platform/agent-windows); see the PowerShell sketch after this list for an automated install.

2) Configure the data which you want to collect (Event Logs, Custom Logs, Perf Counters, IIS Logs, etc.).

3) Data will start flowing into your Log Analytics workspace shortly after (typically within 20 minutes).

4) You can also create alerts and monitors.
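
For step 1, the agent can also be pushed from PowerShell instead of installed by hand. A minimal sketch using the standard MicrosoftMonitoringAgent VM extension (the resource group, VM name, location, and workspace ID/key are placeholders, not values from this thread):

    # Sketch: install the Log Analytics (MMA) agent on an Azure VM via the VM extension
    # Assumes the Az.Compute module and an existing Az login; all names below are placeholders
    $workspaceId  = "XXXXXXXXXXXXXXXX"
    $workspaceKey = "XXXXXXXXXXXXXXXX"

    Set-AzVMExtension -ResourceGroupName "MyResourceGroup" `
        -VMName "Server1" `
        -Name "MicrosoftMonitoringAgent" `
        -Publisher "Microsoft.EnterpriseCloud.Monitoring" `
        -ExtensionType "MicrosoftMonitoringAgent" `
        -TypeHandlerVersion "1.0" `
        -Location "westeurope" `
        -Settings @{ workspaceId = $workspaceId } `
        -ProtectedSettings @{ workspaceKey = $workspaceKey }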

@Jente: below is sample code to insert data into Log Analytics by creating a custom table.

    # This is sample code for inserting data into a Log Analytics table
    # Reference: https://docs.microsoft.com/en-us/azure/azure-monitor/platform/data-collector-api

    # Get the maintenance window and cloud name

    $CloudName = Read-Host -Prompt 'Insert the cloud name for maintenance'
    $starttime = Read-Host -Prompt 'Enter the start time of the maintenance (UTC)'
    $endtime   = Read-Host -Prompt 'Enter the end time of the maintenance (UTC)'


    # Replace with your Workspace ID
    $CustomerId = "XXXXXXXXXXXXXXXX"  

    # Replace with your Primary Key
    $SharedKey = "XXXXXXXXXXXXXXXX"

    # Specify the name of the record type that you'll be creating. You will see the table under this name (with a _CL suffix)
    $LogType = "AZStack_Maintenance_Window"

    # You can use an optional field to specify the timestamp from the data. If the time field is not specified, Azure Monitor assumes the time is the message ingestion time
    $TimeStampField = ""


    # Build the record as JSON
    $json = @{CloudName=$CloudName;StartTime=$starttime;EndTime=$endtime}
    $json = ConvertTo-Json $json

    # Create the function to create the authorization signature
    Function Build-Signature ($customerId, $sharedKey, $date, $contentLength, $method, $contentType, $resource)
    {
        $xHeaders = "x-ms-date:" + $date
        $stringToHash = $method + "`n" + $contentLength + "`n" + $contentType + "`n" + $xHeaders + "`n" + $resource

        $bytesToHash = [Text.Encoding]::UTF8.GetBytes($stringToHash)
        $keyBytes = [Convert]::FromBase64String($sharedKey)

        $sha256 = New-Object System.Security.Cryptography.HMACSHA256
        $sha256.Key = $keyBytes
        $calculatedHash = $sha256.ComputeHash($bytesToHash)
        $encodedHash = [Convert]::ToBase64String($calculatedHash)
        $authorization = 'SharedKey {0}:{1}' -f $customerId,$encodedHash
        return $authorization
    }


    # Create the function to create and post the request
    Function Post-LogAnalyticsData($customerId, $sharedKey, $body, $logType)
    {
        $method = "POST"
        $contentType = "application/json"
        $resource = "/api/logs"
        $rfc1123date = [DateTime]::UtcNow.ToString("r")
        $contentLength = $body.Length
        $signature = Build-Signature `
            -customerId $customerId `
            -sharedKey $sharedKey `
            -date $rfc1123date `
            -contentLength $contentLength `
            -method $method `
            -contentType $contentType `
            -resource $resource
        $uri = "https://" + $customerId + ".ods.opinsights.azure.com" + $resource + "?api-version=2016-04-01"

        $headers = @{
            "Authorization" = $signature;
            "Log-Type" = $logType;
            "x-ms-date" = $rfc1123date;
            "time-generated-field" = $TimeStampField;
        }

        $response = Invoke-WebRequest -Uri $uri -Method $method -ContentType $contentType -Headers $headers -Body $body -UseBasicParsing
        return $response.StatusCode

    }


    # Submit the data to the API endpoint
    Post-LogAnalyticsData -customerId $customerId -sharedKey $sharedKey -body ([System.Text.Encoding]::UTF8.GetBytes($json)) -logType $logType
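
Once this runs, the records land in Log Analytics as a custom table named after $LogType with a _CL suffix (here AZStack_Maintenance_Window_CL); it can take a few minutes before the first records become queryable.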

Also, below is sample code for inserting Log Analytics data into a SQL Server database. The same kind of sample should work for Azure SQL Database as well.

    # Sample script to insert data into SQL from Log Analytics
    # It is just a sample, not ready for prod use..

    function QueryLogAnalytics()
    {
        Write-Host "Fetching the data from LA"
        $query = "Perf | summarize by ObjectName, CounterName"
        # Replace with your Workspace ID
        $queryResults = Invoke-AzOperationalInsightsQuery -WorkspaceId "XXXXXXX" -Query $query
        return $queryResults
    }

    function Login()
    {
        # Replace with your Subscription ID
        $Subscription = 'XXXXXX'
        try
        {
            Write-Host "Logging into Azure and selecting subscription..."
            if ([string]::IsNullOrEmpty($(Get-AzContext).Account))
            {
                Connect-AzAccount
            }
            else
            {
                Write-Host "Existing Az session detected. Skipping login prompt."
            }

            Select-AzSubscription -Subscription $Subscription -ErrorAction Stop | Out-Null
        }
        catch
        {
            Write-Error "Failed to login to Azure subscription with error $($_.Exception.Message)"
            Exit 1
        }
    }

    # Main ..

    Login
    # The query results live in the .Results property of the returned object
    $Results = (QueryLogAnalytics).Results
    foreach ($Result in $Results)
    {
        Write-Host "Inserting result as JSON"

        # Convert each row to JSON and escape single quotes for the SQL string literal
        $json = ($Result | ConvertTo-Json -Compress).Replace("'", "''")
        $insertquery = "INSERT INTO [dbo].[Res] VALUES ('$json')"

        Invoke-Sqlcmd -ServerInstance 'ARUNKRALAP' -Query $insertquery -Database Results
    }
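
The INSERT above assumes the [dbo].[Res] table already exists. If it doesn't, a minimal one-time setup could look like this (the single NVARCHAR(MAX) column is my assumption, sized to hold the JSON payload from the sample above):

    # Sketch: one-time creation of the target table used by the sample above
    # Assumes the SqlServer module's Invoke-Sqlcmd; server/database names match the sample
    $createQuery = "IF OBJECT_ID('dbo.Res') IS NULL CREATE TABLE [dbo].[Res] ([Payload] NVARCHAR(MAX))"
    Invoke-Sqlcmd -ServerInstance 'ARUNKRALAP' -Database Results -Query $createQuery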