
I have Linux VMs with the Linux Azure Diagnostics extension installed, configured to push syslog messages to an Event Hub.

I can view my syslog messages on the Event Hub's Process Data blade. Now I am trying to send these logs to Azure Data Explorer (ADX), for which I followed the steps below:

  1. Created a cluster in ADX.
  2. Created a database (Syslog) and a table (SyslogTable) for storing syslog messages.
  3. Created the JSON ingestion mapping for SyslogTable, mapping the fields contained in the Event Hub data.
  4. Created the data ingestion connection which connects the Event Hub to the ADX table.

Everything went through without errors, and .show ingestion failures does not report any failures either, but I am not able to see any data in the ADX table.
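
For reference, these are the sanity checks I ran from the ADX query pane (just a quick sketch; the failures list comes back empty and the row count is 0):

// Any ingestion failures recorded for this table?
.show ingestion failures
| where Table == "SyslogTable"

// Has anything landed in the table at all?
SyslogTable
| count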

Below are the sample configs.

Sample data viewed from the Event Hub in JSON format

{
    "time": "2020-05-18T15:54:01.0000000Z",
    "resourceId": "/subscriptions/xxxxx/resourceGroups/xxxx/providers/Microsoft.Compute/virtualMachines/vmname",
    "properties": {
      "ident": "systemd",
      "Ignore": "syslog",
      "Facility": "daemon",
      "Severity": "info",
      "EventTime": "2020-05-18T15:54:01.0000000",
      "SendingHost": "localhost",
      "Msg": "Removed slice User Slice of root.",
      "hostname": "vmname",
      "FluentdIngestTimestamp": "2020-05-18T15:54:01.0000000Z"
    },
    "category": "daemon",
    "level": "info",
    "operationName": "LinuxSyslogEvent",
    "EventProcessedUtcTime": "2020-05-19T07:39:48.5220591Z",
    "PartitionId": 0,
    "EventEnqueuedUtcTime": "2020-05-18T15:54:05.4390000Z"
}

ADX Table Schema

.create table SyslogTable (
    eventTime: datetime,
    resourceId: string,
    properties: dynamic,
    category: string,
    level: string,
    operationName: string,
    EventProcessedUtcTime: string,
    PartitionId: int,
    EventEnqueuedUtcTime: datetime
)

ADX SyslogTable ingestion mapping

.create table SyslogTable ingestion json mapping "SyslogMapping"
'['
'  {"column": "eventTime", "Properties": {"Path": "$.time"}},'
'  {"column": "resourceId", "Properties": {"Path": "$.resourceId"}},'
'  {"column": "properties", "Properties": {"Path": "$.properties"}},'
'  {"column": "category", "Properties": {"Path": "$.category"}},'
'  {"column": "level", "Properties": {"Path": "$.level"}},'
'  {"column": "operationName", "Properties": {"Path": "$.operationName"}},'
'  {"column": "EventProcessedUtcTime", "Properties": {"Path": "$.EventProcessedUtcTime"}},'
'  {"column": "PartitionId", "Properties": {"Path": "$.PartitionId"}},'
'  {"column": "EventEnqueuedUtcTime", "Properties": {"Path": "$.EventEnqueuedUtcTime"}}'
']'
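
The mapping can be double-checked by listing the table's registered JSON mappings:

// List the JSON ingestion mappings registered on SyslogTable.
.show table SyslogTable ingestion json mappings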

Data Connection settings

Table: SyslogTable
Column Mapping: SyslogMapping
Data Format: Multiline JSON / JSON (tried both)
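
With this in place, I have been checking for new rows with a query along these lines (using the EventEnqueuedUtcTime column mapped above), and it comes back empty:

// Look for recently enqueued events.
SyslogTable
| where EventEnqueuedUtcTime > ago(1h)
| take 10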

So, is there anything I am missing here?


2 Answers

1 vote

The issue of data not being pushed to the ADX table turned out to be that I had specified the $Default consumer group in the data connection settings, while the $Default consumer group was already being used elsewhere to fetch events from the Event Hub.

So the solution was simple: create a new consumer group for the Event Hub and create a new data connection that uses it.

0 votes

Nothing seems to be wrong with your ingestion mapping, taking the table schema and the payload schema into account.

For example, if you run the following, you'll see the data gets ingested successfully:

.ingest inline into table SyslogTable with(format=multijson, ingestionMappingReference='SyslogMapping') <|
{
    "time": "2020-05-18T15:54:01.0000000Z",
    "resourceId": "/subscriptions/xxxxx/resourceGroups/xxxx/providers/Microsoft.Compute/virtualMachines/vmname",
    "properties": {
      "ident": "systemd",
      "Ignore": "syslog",
      "Facility": "daemon",
      "Severity": "info",
      "EventTime": "2020-05-18T15:54:01.0000000",
      "SendingHost": "localhost",
      "Msg": "Removed slice User Slice of root.",
      "hostname": "vmname",
      "FluentdIngestTimestamp": "2020-05-18T15:54:01.0000000Z"
    },
    "category": "daemon",
    "level": "info",
    "operationName": "LinuxSyslogEvent",
    "EventProcessedUtcTime": "2020-05-19T07:39:48.5220591Z",
    "PartitionId": 0,
    "EventEnqueuedUtcTime": "2020-05-18T15:54:05.4390000Z"
}

To troubleshoot the issue you're facing, and assuming you've already verified that data is successfully being pushed to the Event Hub, I'd recommend opening a support ticket for your resource via the Azure portal.
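
One thing worth ruling out first: by default, queued ingestion batches incoming data for up to 5 minutes (or 1000 items / ~1 GB, whichever comes first) before committing it, so an empty table immediately after setup isn't necessarily a failure. You can inspect the table's batching policy, and shorten it while testing; the values below are only illustrative:

// Show the effective ingestion batching policy (empty output means the defaults apply).
.show table SyslogTable policy ingestionbatching

// Optionally shorten the batching window for testing.
.alter table SyslogTable policy ingestionbatching @'{"MaximumBatchingTimeSpan": "00:00:30", "MaximumNumberOfItems": 500, "MaximumRawDataSizeMB": 1024}'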