🔥 Check out our new Azure Developer page at azure.com/developer.
📺 Watch the video: How to query Azure Storage logs in Azure Monitor Log Analytics.
💡 Learn more: Overview of log queries in Azure Monitor.
Azure Storage is one of the fundamental services in Azure that you probably use for a lot of different things in your applications. In Azure Storage, you can enable diagnostics logs to understand which operations were executed against the items in your storage account and whether they succeeded.
You can visualize the diagnostics logs in the Metrics menu of Azure Storage, which enables you to create charts from the metrics in the logs. This is great; however, there is no out-of-the-box way to query the data and actually analyze the logs. We can use a slightly more involved solution to query the diagnostics logs: loading them into Azure Log Analytics. Let's take a look at how to do that.
To capture diagnostics from Azure Storage, we first need to enable diagnostic logging. This is very easy to do.
(Enable Azure Storage diagnostic logs in the Azure portal)
You can now see the logs in the Azure Storage Explorer. For instance, when you've enabled logging for blobs, the logs appear in a $logs container and are organized in folders that represent the log date. When you drill down into a folder and open a log file, you can see that each log entry is written on a separate line, with its values separated by semicolons.
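To get a feel for that format, here is a minimal Python sketch that splits such a line into named fields. The field names and the sample line below are illustrative only; real Storage Analytics entries contain many more semicolon-separated fields (some quoted), so check the log format documentation for the full schema.

```python
# Sketch: parse a simplified, made-up Storage Analytics log line.
# Real entries have ~30 semicolon-separated fields; the names below
# cover just the columns used later in this post.
FIELDS = ["version", "request_start_time", "operation_type",
          "request_status", "http_status_code", "request_url"]

def parse_log_line(line: str) -> dict:
    """Split a semicolon-delimited log line into a field dict."""
    values = line.strip().split(";")
    return dict(zip(FIELDS, values))

sample = ('1.0;2019-11-22T10:15:30.0000000Z;GetBlob;Success;200;'
          '"https://myaccount.blob.core.windows.net/container/file.txt"')
entry = parse_log_line(sample)
print(entry["operation_type"], entry["request_status"])  # GetBlob Success
```

Note that a naive split breaks down if a quoted field (such as a URL) contains a semicolon itself; a real parser should use a CSV reader with `delimiter=";"`.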
(Blob storage diagnostic logs in the Azure Storage Explorer)
Now that we have logs in Azure Storage, let's create a Log Analytics workspace to load them into and query them.
(Create a new Log Analytics workspace in the Azure portal)
Azure Log Analytics is a service to which you can connect all sorts of services and diagnostic sources in order to monitor and analyze them. Out of the box, you can connect most Azure resources to Log Analytics, including Azure Storage. However, at the time of writing this post, there is no out-of-the-box option to load the diagnostics logs into Log Analytics, so we have to do that ourselves.
We can load custom data into Log Analytics using the HTTP Data Collector API. But before we do, we need to convert the diagnostic logs into JSON format, as that is what the API expects. Luckily, there is a PowerShell script that we can use for that. This PowerShell script downloads the logs from Azure Storage, converts them into JSON and uploads them to Azure Log Analytics.
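Under the hood, the HTTP Data Collector API authenticates each request with an HMAC-SHA256 signature over a canonical string, and the custom log type you send gets a `_CL` suffix in Log Analytics. The sketch below shows in Python roughly what the PowerShell script does; the workspace ID, shared key, and JSON payload are placeholders, not real values.

```python
import base64
import hashlib
import hmac
import json
from datetime import datetime, timezone

def build_signature(workspace_id: str, shared_key: str,
                    date: str, content_length: int) -> str:
    """Build the SharedKey authorization header for the Data Collector API."""
    string_to_sign = (f"POST\n{content_length}\napplication/json\n"
                      f"x-ms-date:{date}\n/api/logs")
    decoded_key = base64.b64decode(shared_key)
    signature = base64.b64encode(
        hmac.new(decoded_key, string_to_sign.encode("utf-8"),
                 hashlib.sha256).digest()).decode("utf-8")
    return f"SharedKey {workspace_id}:{signature}"

# Placeholder values - use your real workspace ID and primary key.
workspace_id = "00000000-0000-0000-0000-000000000000"
shared_key = base64.b64encode(b"not-a-real-key").decode("utf-8")

body = json.dumps([{"operation_type": "GetBlob", "request_status": "Success"}])
rfc1123_date = datetime.now(timezone.utc).strftime("%a, %d %b %Y %H:%M:%S GMT")
headers = {
    "Content-Type": "application/json",
    "Log-Type": "MyStorageLogs1",  # appears as MyStorageLogs1_CL in queries
    "x-ms-date": rfc1123_date,
    "Authorization": build_signature(workspace_id, shared_key,
                                     rfc1123_date, len(body)),
}
# The script then POSTs `body` with these headers to
# https://{workspace_id}.ods.opinsights.azure.com/api/logs?api-version=2016-04-01
```

The important design point is that the key is never sent over the wire; only the signature derived from it is, which is why the date and content length must match the actual request exactly.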
Before you execute the PowerShell script, you need to fill in the parameters that it needs; the script explains in detail what to fill in. You can execute the script locally or in the Azure Cloud Shell. You can use the Azure Cloud Shell from any OS, including macOS; you just need a browser and an Azure subscription. Once you run the script, it starts downloading the logs, converting them into JSON, and uploading them. This can take a while, depending on how many log files you have.
(Execute the PowerShell script)
Now that all of the data is in Azure Log Analytics, you can take advantage of its ability to query data.
MyStorageLogs1_CL
| where request_start_time_t > datetime("2019-11-21")
| where request_start_time_t < datetime("2019-11-23")
| where request_status_s != "Success"
| project request_start_time_t, request_status_s, operation_type_s, request_url_s
This query gets all of the operations that weren't successful within a certain time slot.
(Query log data in Azure Log Analytics in the Azure portal)
You can see the results as a list or render them into a chart (which requires you to include numeric columns in the results, such as the total number of requests).
Azure Storage diagnostic logs contain a lot of valuable information that isn't easy to extract without the right tools. When you use the querying capabilities of Azure Log Analytics, you can get insights about Azure Storage that you wouldn't get otherwise. Unfortunately, loading the storage logs into Log Analytics is a bit clumsy and requires a custom script. In the future, the logs might become a first-class citizen that can be connected to Log Analytics directly; for now, just follow the steps above. You can also automate the script, for example by running it every 24 hours with an Azure Logic App, so that you have the latest logs in Log Analytics every day. Go and try it out!