
Azure Logger is a powerful tool that helps you efficiently log events in your Azure applications. It's designed to be easy to use and integrates seamlessly with Azure services.
To get started, you'll need to add the Azure SDK client library NuGet packages you plan to use to your .NET project; the shared Azure.Core package they depend on provides the logging plumbing.
With it, you can log events at different levels, including Verbose, Informational, Warning, Error, and Critical. This helps you track issues and debug your application more effectively.
Azure Logger also supports log filtering, which allows you to filter logs based on specific criteria, such as log level or category. This feature is particularly useful for large-scale applications where log data can be overwhelming.
Azure Logger Basics
Azure Monitor Logs provides you with the tools to collect, transform, and route data to tables in your Log Analytics workspace.
Event logs are output at one of three levels:

- Informational: request and response events
- Warning: errors
- Verbose: detailed messages and content logging

Note: Content logging is disabled by default. To enable it, see Log HTTP request and response bodies.
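As a minimal sketch of what enabling content logging can look like on a client (the ServiceBusClient and placeholder connection string here are illustrative assumptions; the same Diagnostics options live on other clients' options types):

```csharp
using Azure.Messaging.ServiceBus;

// Content (request/response body) logging is off by default.
// Opt in per client through the Diagnostics section of its ClientOptions.
var options = new ServiceBusClientOptions();
options.Diagnostics.IsLoggingEnabled = true;        // event logging itself (on by default)
options.Diagnostics.IsLoggingContentEnabled = true; // also log request and response bodies

var client = new ServiceBusClient("<service-bus-connection-string>", options);
```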
Information
Information is key when it comes to monitoring and logging Azure resources. Azure Monitor Logs provides the tools to collect, transform, and route that data to tables in your Log Analytics workspace, and the SDK event levels described above (Informational, Warning, and Verbose) indicate how much detail each entry carries.
The Azure Monitor Logs API can be used to collect logs from Azure resources. Log360, ManageEngine's log management tool for Microsoft Azure, can also collect logs from devices and applications in your Azure cloud infrastructure; it gathers them securely and helps detect performance bottlenecks.
On the SDK side, each HTTP request and response is logged, with query parameter and header values sanitized to remove personal data. Here are the different types of log information:
- Unique IDs
- HTTP methods
- URIs
- Outgoing request headers
- And more
Log360 can collect logs from a wide range of Azure resources and environments, including Azure activity logs, diagnostic logs, and Azure Security Center for security event monitoring.
Built-in Methods
The Azure SDK for .NET's client libraries log events to Event Tracing for Windows (ETW) via the System.Diagnostics.Tracing.EventSource class.
You can use the Azure.Core.Diagnostics.AzureEventSourceListener class to simplify comprehensive logging for your .NET app. This class contains two static methods: CreateConsoleLogger and CreateTraceLogger.
These methods accept an optional parameter that specifies a log level. If the parameter isn't provided, the default log level of Informational is used.
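Here's a minimal sketch of both helpers in use; the Verbose level passed to the console logger is just an illustrative choice:

```csharp
using System.Diagnostics.Tracing;
using Azure.Core.Diagnostics;

// Write every Azure SDK event at Verbose level and above to the console.
using AzureEventSourceListener consoleListener =
    AzureEventSourceListener.CreateConsoleLogger(EventLevel.Verbose);

// Or forward events to System.Diagnostics tracing at the default Informational level.
using AzureEventSourceListener traceListener =
    AzureEventSourceListener.CreateTraceLogger();

// Keep the listeners alive for as long as you want logging; disposing them stops it.
```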
Directives
Directives are a crucial part of setting up Azure log collection with NXLog's Azure input module, and understanding them will save you a lot of time and headaches.
There are three modes to choose from: Table, Blob, and Analytics. Each mode has its own set of mandatory directives that you need to specify.
Here's a quick rundown of the directives for each mode:
- Table mode: define the AuthKey, StorageName, and TableName directives. AuthKey is deprecated as of NXLog Enterprise Edition 6.0, so use the SharedKey directive instead.
- Blob mode: also specify the BlobName and SharedKey directives. Make sure to use the correct format for BlobName, which can be either a blob container or a single blob path.
- Analytics mode: define the ClientID, SharedKey, TenantID, and WorkspaceID directives. Here too, AuthKey is deprecated as of NXLog Enterprise Edition 6.0 in favor of SharedKey.
Remember to check the documentation for any specific requirements or restrictions for each mode.
Microsoft Sentinel and Microsoft Defender for Cloud
Microsoft Sentinel and Microsoft Defender for Cloud are two powerful services that help with security monitoring in Azure. They store their data in Azure Monitor Logs, allowing for analysis with other log data collected by Azure Monitor.
Microsoft Sentinel and Microsoft Defender for Cloud are designed to work together seamlessly, providing a comprehensive security solution for Azure users. Microsoft Sentinel collects and analyzes security-related data, while Microsoft Defender for Cloud focuses on cloud security posture and threat protection.
Azure Monitor Logs plays a crucial role in this process, as it stores the data from both Microsoft Sentinel and Microsoft Defender for Cloud. This enables users to analyze and gain insights from the combined data, improving their overall security posture.
Configuration and Setup
To configure custom logging with the Azure SDK for .NET, you can register event listeners to receive log messages. This can be done by constructing an instance of the AzureEventSourceListener class, passing it a callback method, and specifying the log levels to include.
You can also configure Azure web application logging and storage from the Azure Management Portal by creating a new storage account, specifying the storage account name, region, and redundancy type, and then enabling application logging (Blob).
For Analytics mode, you need to register an Azure Active Directory application and grant it the necessary permissions to read from the Log Analytics workspace. This involves creating a new app registration, selecting the Log Analytics API, and adding the app to your Log Analytics workspace.
Configure Custom
You can configure custom logging by registering event listeners to receive log messages from the Azure SDK for .NET. This allows you to process log messages in a way that suits your needs.
The AzureEventSourceListener class is a key part of this process, and you can pass it a callback method that will receive log messages. You can also specify the log levels to include when constructing the instance.

To log to the console with a custom message, you can create an event listener that filters logs to those events emitted from the Azure Core client library with a level of verbose. The Azure Core library uses an event source name of Azure-Core.
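Here's a minimal sketch of such a listener; the constructor overload and the "Azure-Core" event source name follow the description above, while the message format is an illustrative choice:

```csharp
using System;
using System.Diagnostics.Tracing;
using Azure.Core.Diagnostics;

// Receive Azure SDK events at Verbose level and above, keep only those emitted
// by the Azure Core client library, and write them to the console with a custom prefix.
using var listener = new AzureEventSourceListener(
    (eventArgs, message) =>
    {
        if (eventArgs.EventSource.Name == "Azure-Core")
        {
            Console.WriteLine($"[Azure SDK] {eventArgs.Level}: {message}");
        }
    },
    EventLevel.Verbose);
```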
Custom logging can be a powerful tool for understanding what's happening in your application. By processing log messages in a way that makes sense to you, you can gain valuable insights and troubleshoot issues more effectively.
You can also create your own visualizations and reports using workbooks, dashboards, and Power BI. This allows you to present data in an intuitive way and monitor the performance and availability of your cloud and hybrid applications.
Storage Setup
To set up storage for your Azure web application logging, you'll need to create a new storage account in the Azure Management Portal, providing a storage account name, region, and redundancy type.
The storage account name is crucial, as you'll need to specify it in the NXLog configuration. You can find more information on storage settings in the Microsoft documentation on how to Create a storage account.
Once you've created the storage account, navigate to your app and select App Service logs. From there, you can select On for Application Logging (Blob) and configure the Storage Settings corresponding with the storage account you created.
After configuring the storage settings, confirm the changes by clicking Save, then restart the service. Note that it may take a while for Azure to create the table and/or blob in the storage.
Here's a step-by-step summary of the setup:
- Create the new storage account, providing a storage account name, region, and redundancy type.
- Navigate to your app and select App Service logs.
- Select On for Application Logging (Blob).
- Configure Storage Settings corresponding with the storage account created.
- Confirm the changes by clicking Save, then restart the service.
ASP.NET Core and Client Registration
To log events in ASP.NET Core, you'll need to register the Azure SDK library's client with the Dependency Injection (DI) container. You can do this by installing the Microsoft.Extensions.Azure NuGet package and calling the AddAzureClients extension method in your Program.cs file.
For example, you can add the ServiceBus client by calling `builder.Services.AddAzureClients(azureBuilder => { azureBuilder.AddServiceBusClient(builder.Configuration.GetConnectionString("ServiceBus")); azureBuilder.UseCredential(new DefaultAzureCredential()); });`. This will register the ServiceBus client with the DI container.
Alternatively, if you can't or don't need to register the client with the DI container, you can use the AzureEventSourceLogForwarder service to forward Azure SDK library events to the Azure Monitor. To do this, install the Microsoft.Extensions.Azure NuGet package and register the log forwarder service as a singleton in the DI container.
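For illustration, here's a minimal Program.cs sketch of that registration, assuming an ASP.NET Core app using minimal hosting with the Microsoft.Extensions.Azure, Azure.Messaging.ServiceBus, and Azure.Identity packages installed, and a connection string named "ServiceBus" as in the example above:

```csharp
using Azure.Identity;
using Microsoft.Extensions.Azure;

var builder = WebApplication.CreateBuilder(args);

// Register the Service Bus client with the DI container; Azure SDK log events
// then flow into the standard ASP.NET Core logging pipeline.
builder.Services.AddAzureClients(azureBuilder =>
{
    azureBuilder.AddServiceBusClient(
        builder.Configuration.GetConnectionString("ServiceBus"));
    azureBuilder.UseCredential(new DefaultAzureCredential());
});

var app = builder.Build();
app.Run();
```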
ASP.NET Core Mapping
ASP.NET Core Mapping is a crucial aspect of client registration, allowing you to leverage the standard ASP.NET Core logging configuration for logging. The AzureEventSourceLogForwarder service enables this by forwarding log messages from Azure SDK event sources to ILoggerFactory.
The mapping between the Azure SDK EventLevel and the ASP.NET Core LogLevel is straightforward. Here's a handy table to refer to:
- Critical → Critical
- Error → Error
- Informational → Information
- Warning → Warning
- Verbose → Debug
- LogAlways → Information
By using this mapping, you can easily integrate your Azure SDK logging with your ASP.NET Core logging configuration.
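For example, here's how that mapping could be used with standard ASP.NET Core log filtering, continuing the Program.cs sketch above; the "Azure.Core" category name assumes the forwarder exposes the Azure-Core event source under that name, so verify the categories your app actually emits.

```csharp
// Raise the level for Azure SDK core events (SDK Verbose maps to ASP.NET Core Debug,
// per the list above) while leaving the rest of the app at its configured defaults.
builder.Logging.AddFilter("Azure.Core", LogLevel.Debug);
```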
Advanced Features
Azure logging is packed with advanced features that make logging and monitoring easier. A key one is the ability to handle large volumes of log data, which matters for high-traffic applications where logs can quickly add up.
You can also configure logs to be forwarded to multiple destinations, including Azure Monitor, Event Hubs, and a Log Analytics workspace.
In addition, you can define custom log categories and fine-grained level filters, letting you tune your logging settings to your specific needs. That flexibility is a real help for developers who need to troubleshoot complex issues.
Table Mode Directives
Table Mode Directives are a crucial part of setting up your Azure Storage connection in NXLog Enterprise Edition.
The AuthKey directive has historically been mandatory in Table mode, but it is deprecated as of NXLog Enterprise Edition 6.0.

Use the SharedKey directive instead to specify the authentication key for connecting to the Azure Storage account; it's the only option available in NXLog Enterprise Edition 6.0 and later.
StorageName is another mandatory directive that specifies the name of the storage account to connect to.
You'll also need to specify the TableName directive, which specifies the storage table from which to collect logs.
Here's a summary of the mandatory directives you'll need to use in Table mode:
- SharedKey: the authentication key for the Azure Storage account (replaces the deprecated AuthKey)
- StorageName: the name of the storage account to connect to
- TableName: the storage table from which to collect logs
Built-in Insights and Custom Dashboards
Azure Monitor's ready-to-use, curated Insights experiences store data in Azure Monitor Logs, presenting this data in an intuitive way so you can monitor the performance and availability of your cloud and hybrid applications.
You can create custom visualizations and reports using workbooks, dashboards, and Power BI to suit your specific needs.
Many of these Insights experiences are ready-to-use, saving you time and effort in setting up monitoring for your applications.
You can also use workbooks to create your own visualizations and reports, giving you a high degree of flexibility and control.
Frequently Asked Questions
How to capture logs in Azure?
To capture logs in Azure, open the Queries window for your resource, select a prebuilt log query that matches your resource type, and click Run to retrieve the results.
What are Azure logs?
Azure logs are a collection of data generated by your cloud and on-premises resources and applications. They provide valuable insights into your system's performance and behavior, helping you identify issues and optimize your infrastructure.
Sources
- https://learn.microsoft.com/en-us/dotnet/azure/sdk/logging
- https://docs.nxlog.co/refman/current/im/azure.html
- https://stackoverflow.com/questions/67244690/how-do-i-enable-info-logging-using-the-azure-core-logger-for-javascript
- https://learn.microsoft.com/en-us/azure/azure-monitor/logs/data-platform-logs
- https://www.manageengine.com/log-management/azure-log-analytics.html