Setting up a Log Analytics workspace for production in the enterprise

Operations and security are central to any cloud deployment, and they should be top of mind in each of your cloud deployments.

Enabling your operations team to find and fix errors, and building practices around scaling your data, are essential to a successful Azure data center.

Log Analytics provides a unified way to show what is happening across your Azure data center.

In this article, you learn how to set up Log Analytics to receive data from multiple Azure subscriptions, on-premises virtual machines, and other clouds. You also learn how to configure your Log Analytics workspace, set up role-based access control, and incorporate Log Analytics best practices. In addition, you learn how to get started with some important queries.

Before you deploy resource groups and resources, you want to be able to measure and analyze what is happening in your Azure data center. So before you start your deployments, it is a best practice to do these things:

  1. Deploy your Azure Log Analytics workspace
  2. Configure your Log Analytics workspace
  3. Deploy Security Center and have it send its logs to Azure Log Analytics. (You will learn about Security Center in the next article.)

And then you can deploy your workloads.

In the next article, learn how to associate Azure Log Analytics with Security Center.


You should already have an Azure subscription and the Azure PowerShell (Az) module available, either installed locally or through Azure Cloud Shell.


Let’s start with some definitions about the changing terminology around Azure Monitor and Log Analytics.

  • Azure Monitor log data is stored in a Log Analytics workspace.
  • The term Log Analytics is changing to Azure Monitor logs.
  • Log Analytics now primarily refers to the page in the Azure portal used to write and run queries and analyze log data.
  • Log Analytics and Application Insights have been consolidated into Azure Monitor.
  • The Log Analytics workspaces page is where you create new workspaces and configure data sources.
  • Management solutions have been renamed to monitoring solutions.

About Log Analytics workspaces

Azure Log Analytics is the primary tool in the Microsoft Azure portal for writing log queries and interactively analyzing their results. For those who have been around awhile, it was known as OMS (Operations Management Suite).

Azure Monitor and many Azure resources store log data in a Log Analytics workspace. The workspace is a central repository that you can use to collect information from monitors and many other sources.

The following illustration shows how you collect data from multiple data sources and then use Log Analytics for alerts, analysis, and reports.


In this article, you set up a shared Log Analytics workspace that is used across multiple applications and multiple subscriptions. You can think of it as a central workspace to monitor the compute, network, and storage for your production environment.

See Designing your Azure Monitor Logs deployment for more information about how to design your workspace and where your data is collected and aggregated.

Set up Log Analytics workspace

Log Analytics can collect Azure Monitor data from across multiple applications and subscriptions, and even operations data from on-premises machines or other clouds.

Select a pricing model based on the amount of data brought in, called per-GB pricing. This pricing model works best for containers and microservices, where the definition of a node is less clear. Per-GB data ingestion is the new basis for pricing across application, infrastructure, and network monitoring.

To strictly control access to the Log Analytics data, you may want to create a subscription for your operations team that contains Log Analytics and perhaps other sensitive resources, such as Key Vault. When you set up your Log Analytics workspace, you can configure the other data sources to send their data to it, regardless of region, to aggregate across subscriptions. The name of a Log Analytics workspace is unique across all of Azure, so it can be used to accept data from all of your resources.

Typically in an enterprise, you will have Azure Monitor data and data from Security Center and other resources providing data to a centralized Log Analytics workspace, as shown in the following illustration.


Architects will often set up a Log Analytics workspace for developers to monitor applications during development and then have a second one for production.

You may want to consider putting your production Log Analytics workspace in its own subscription so you can strictly control who has access, who can view data, and who receives alarms. Keeping the number of people who have access to the subscription to a minimum helps protect the data in Log Analytics.

You can create the Log Analytics workspace using the portal, Azure CLI, or PowerShell. In this article, you will set up the Log Analytics workspace using PowerShell.

Set up Log Analytics workspace using PowerShell and an ARM template

Log into Azure using PowerShell. This can be on your local computer or in Azure Cloud Shell.
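A minimal sketch using the Az PowerShell module might look like the following; the subscription ID is a placeholder you replace with your own.

```powershell
# Sign in interactively. In Azure Cloud Shell you are already
# authenticated and can skip Connect-AzAccount.
Connect-AzAccount

# Point the session at the subscription that will host the workspace.
Set-AzContext -SubscriptionId "<SUBSCRIPTION_ID>"
```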

Then create a resource group to hold the Log Analytics workspace and its long-term data. The following example shows how to create such a resource group, using your own SUBSCRIPTION_ID and your own values for the tag parameters.
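A sketch of such a resource group creation follows; the resource group name, location, and tag values are examples you would substitute with your own.

```powershell
# Create the resource group that will hold the workspace and its data.
# Name, location, and tags below are illustrative examples.
New-AzResourceGroup `
    -Name "rg-loganalytics-prod" `
    -Location "eastus" `
    -Tag @{ Environment = "Production"; Owner = "OperationsTeam" }
```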

Next, you will use an ARM template to create a Log Analytics workspace, create a storage account to hold the Monitor log data, and lock the storage account and Log Analytics workspace resources against deletion. (You will learn more about ARM templates in later posts.)

To deploy, you can either set up a parameters file, or deploy with a script similar to the following:
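A deployment script along those lines might look like this; the deployment name, resource group, and file names are assumptions for illustration.

```powershell
# Deploy the ARM template into the resource group created earlier.
# Template and parameter file names are illustrative; use your own paths.
New-AzResourceGroupDeployment `
    -Name "loganalytics-deployment" `
    -ResourceGroupName "rg-loganalytics-prod" `
    -TemplateFile "./loganalytics.json" `
    -TemplateParameterFile "./loganalytics.parameters.json"
```

The deployment's outputs (such as the workspace name) are returned when the command completes.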

Note that the Log Analytics workspace is not available in every region, and Log Analytics does not need to be colocated in any particular region with your resources. It may be a good idea to keep your analytics in a separate region so that you can still query and mitigate during a regional outage.

Run the PowerShell script in this section to deploy the template. The template returns the workspace name that you will need in the next section.

Configure workspace

Next, to configure the workspace, you connect Log Analytics to the resources you want to track.

In the portal, navigate to the Overview page of your newly created Log Analytics workspace as shown in the following illustration.


Azure provides Activity Logs out of the box. To add Activity Logs to Log Analytics, click the Azure Activity Logs link and select the subscriptions you want to analyze.

Platform logs and Platform metrics

Platform logs provide detailed diagnostic and auditing information for Azure resources and the Azure platform they depend on. They are generated automatically, although you need to configure some platform logs to be forwarded to Log Analytics. Platform metrics are collected by default and are typically stored in the Azure Monitor metrics database.

Each Azure resource requires its own diagnostic setting. The available categories will vary for different resource types.

You can send this data to Log Analytics, Event Hubs, a storage account, or any combination of these.

This means that you will need to configure your resources to send their performance data and logs to Log Analytics as you build them.

You can set logs at different layers of Azure:

  • Resource logs. Operations within an Azure resource (the data plane), for example getting a secret from a Key Vault or making a request to a database. Resource logs were previously referred to as diagnostic logs.
  • Activity log. Operations on each Azure resource in the subscription from the outside (the management plane) in addition to updates on Service Health events.
  • Azure Active Directory logs. History of sign-in activity and audit trail of changes made in the Azure Active Directory.

To analyze these logs in Log Analytics, you need to send platform logs to your Log Analytics workspace.

You can use PowerShell to create the diagnostics settings. For example, the following PowerShell code will send Key Vault diagnostics data to the Log Analytics workspace.
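A sketch of such a diagnostic setting follows; the vault name, workspace name, resource group, and setting name are placeholders, and this assumes the Az.KeyVault, Az.OperationalInsights, and Az.Monitor modules.

```powershell
# Look up the Key Vault and the workspace; names are placeholders.
$kv = Get-AzKeyVault -VaultName "<KEY_VAULT_NAME>"
$ws = Get-AzOperationalInsightsWorkspace `
    -ResourceGroupName "rg-loganalytics-prod" `
    -Name "<WORKSPACE_NAME>"

# Route the vault's diagnostics data to the workspace.
Set-AzDiagnosticSetting `
    -Name "send-to-log-analytics" `
    -ResourceId $kv.ResourceId `
    -WorkspaceId $ws.ResourceId `
    -Enabled $true
```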

Or you can configure your workspace using an ARM template.

Configure your workspace using an ARM template

You may want to connect your workspace to one or more of the following:

  1. Add solutions to the workspace
  2. Create saved searches. To ensure that deployments don’t accidentally override saved searches, add an eTag property to the savedSearches resource to maintain idempotency.
  3. Create saved functions. Again, add the eTag property to maintain idempotency.
  4. Create a computer group
  5. Enable collection of IIS logs from computers with the Windows agent installed
  6. Collect Logical Disk perf counters from Linux computers (% Used Inodes; Free Megabytes; % Used Space; Disk Transfers/sec; Disk Reads/sec; Disk Writes/sec)
  7. Collect syslog events from Linux computers
  8. Collect Error and Warning events from the Application Event Log from Windows computers
  9. Collect Memory Available Mbytes performance counter from Windows computers
  10. Collect IIS logs and Windows Event logs written by Azure diagnostics to a storage account
  11. Collect custom logs from Windows computers

The Azure documentation provides a sample ARM template that performs each of these steps. The following sample, adapted from the documentation, shows how to configure the Log Analytics workspace.
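As an illustration of the saved-search step, a fragment of such a template might look like the following; the resource name, category, and query are examples, and the etag property is what keeps redeployments idempotent.

```json
{
  "type": "Microsoft.OperationalInsights/workspaces/savedSearches",
  "apiVersion": "2020-08-01",
  "name": "[concat(parameters('workspaceName'), '/CriticalEvents')]",
  "properties": {
    "etag": "*",
    "category": "Samples",
    "displayName": "Critical events",
    "query": "Event | where EventLevelName == \"Critical\""
  }
}
```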

As owner of the workspace, you will quickly see that you can view and manage sensitive data. To control who else can, set up access control.

Set up access control

Not everyone should have access to the logs or be able to build queries. You can control access by assigning groups to built-in roles, or by configuring custom roles based on your particular needs.

Using built-in roles for Log Analytics workspaces

Azure has two built-in user roles for Log Analytics workspaces:

  • Log Analytics Reader
  • Log Analytics Contributor

Members of the Log Analytics Reader role can:

  • View and search all monitoring data.
  • View monitoring settings, including viewing the configuration of Azure diagnostics on all Azure resources.

Members of the Log Analytics Contributor role can:

  • Use all the privileges of the Log Analytics Reader role, allowing the user to read all monitoring data
  • Create and configure Automation accounts (permission must be granted at the resource group or subscription scope)
  • Add and remove management solutions (permission must be granted at the resource group or subscription scope)
  • Read storage account keys
  • Configure the collection of logs from Azure Storage
  • Edit monitoring settings for Azure resources, including
    • Adding the VM extension to VMs
    • Configuring Azure diagnostics on all Azure resources

The following PowerShell command shows how you can set up a group named Log Analytics Reader Group.
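A sketch of that group creation follows, assuming the Az.Resources module; the mail nickname is an arbitrary example.

```powershell
# Create an Azure AD security group for Log Analytics readers.
# The mail nickname is an illustrative placeholder.
New-AzADGroup `
    -DisplayName "Log Analytics Reader Group" `
    -MailNickname "loganalyticsreaders"
```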

You can then assign users to the group to manage access.

Using custom roles for Log Analytics workspaces

You may want to set up a custom role that combines permissions, giving you more granular control of who accesses data in the Log Analytics workspace. You can fine-tune who has permissions to the data in the workspace using workspace permissions, which you can combine into custom roles. See Azure custom roles.

Role based access control best practices

Assign roles to security groups instead of individual users to reduce the number of assignments.

The following script assigns a group named Log Analytics Reader Group to the resource group.
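A sketch of that role assignment follows; the resource group name is an example, and the group is assumed to exist already.

```powershell
# Look up the security group created earlier.
$group = Get-AzADGroup -DisplayName "Log Analytics Reader Group"

# Grant the group the built-in Log Analytics Reader role
# at the scope of the resource group.
New-AzRoleAssignment `
    -ObjectId $group.Id `
    -RoleDefinitionName "Log Analytics Reader" `
    -ResourceGroupName "rg-loganalytics-prod"
```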

Users assigned to the Log Analytics Reader Group in Azure Active Directory would have Log Analytics Reader permissions in the resources within the resource group.

Set up your queries

Once you have data coming into Log Analytics, you will want to set up your queries. The art for Cloud Engineers is to build queries that provide visibility into your own cloud operations.

To get started, Azure provides some samples. To find the samples, navigate to your Log Analytics workspace. Click Logs in the General panel. You can find some example queries as shown in the following illustration.


Building queries in Log Analytics workspace

When you examine the queries, you can see they are very SQL-like. You can use Azure Data Explorer to explore the tables and columns available to you.

Azure Monitor Logs is based on Azure Data Explorer, and log queries are written using the same Kusto Query Language (KQL). If you are familiar with SQL, Kusto queries will feel similar (see SQL to Kusto query translation for more details).

As an example, the following query counts how many rows in the Logs table have a Level column equal to the string Critical:
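Such a query might look like the following; the exact table and column names depend on the data you collect into your workspace.

```kusto
// Count rows where the Level column equals "Critical".
Logs
| where Level == "Critical"
| count
```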

Use Log Analytics lab walkthrough tutorial

To get started using Log Analytics, use a demo environment that includes sample data to learn from. In the demo environment, you won’t be able to save queries or pin results to a dashboard, but you can write queries and walk through the tutorial.

Next step

In this article, you learned how to set up and configure a Log Analytics workspace.

As you build each resource, you will want to consider how you will monitor and then send data to Log Analytics for your Cloud Engineers to analyze. You will want to associate your resources to Log Analytics as you deploy each resource.

Next, deploy Security Center and connect it to your Log Analytics workspace, which is covered in the following article.