FinOps hub template

A behind-the-scenes look at what makes up the FinOps hub template, including inputs and outputs.
This template creates a new FinOps hub instance.
FinOps hubs include:
- Data Lake storage to host cost data.
- Data Factory for data processing and orchestration.
- Key Vault for storing secrets.
To use this template, you will need to create a Cost Management export that publishes cost data to the `msexports` container in the included storage account. See Create a new hub for details.
📋 Prerequisites
Ensure the following prerequisites are met before deploying this template:

- You must have the following permissions to create the deployed resources:

  Resource | Minimum RBAC |
  ---|---|
  Deploy and configure Data Factory¹ | Data Factory Contributor |
  Deploy Key Vault¹ | Key Vault Contributor |
  Configure Key Vault secrets¹ | Key Vault Administrator |
  Create managed identity¹ | Managed Identity Contributor |
  Deploy and configure storage¹ | Storage Account Contributor |
  Assign managed identity to resources¹ | Managed Identity Operator |
  Create deployment scripts¹ | Custom role containing only the `Microsoft.Resources/deploymentScripts/write` and `Microsoft.ContainerInstance/containerGroups/write` permissions as allowed actions or, alternatively, Contributor, which includes these permissions and all the above roles |
  Assign permissions to managed identities¹ | Role Based Access Control Administrator or, alternatively, Owner, which includes this and all the above roles |
  Create a subscription or resource group cost export² | Cost Management Contributor |
  Create an EA billing cost export² | Enterprise Reader, Department Reader, or Enrollment Account Owner (Learn more) |
  Create an MCA billing cost export² | Contributor |
  Read blob data in storage³ | Storage Blob Data Contributor |

  1. Deployment permissions for hub resources can be assigned at the resource group scope.
  2. Cost Management permissions must be assigned on the scope you want to export your costs from.
  3. Blob data permissions are required to access exported cost data from Power BI or other client tools.

- The Microsoft.EventGrid resource provider must be registered in your subscription. See Register a resource provider for details.

  If you forget this step, the deployment will succeed, but the pipeline trigger will not be started and data will not be ready. See Troubleshooting Power BI reports for details.
📥 Parameters
Parameter | Type | Description | Default value |
---|---|---|---|
hubName | string | Optional. Name of the hub. Used to ensure unique resource names. | "finops-hub" |
location | string | Optional. Azure location where all resources should be created. See https://aka.ms/azureregions. | Same as deployment |
skipEventGridRegistration | bool | Indicates whether the Event Grid resource provider has already been registered (e.g., in a previous hub deployment). Event Grid RP registration is required. If not set, a temporary Event Grid namespace will be created to auto-register the resource provider. | false (register RP) |
eventGridLocation | string | Optional. Azure location to use for a temporary Event Grid namespace to register the Microsoft.EventGrid resource provider if the primary location is not supported. The namespace is deleted afterward and is not used for hub operation. | Same as location |
storageSku | string | Optional. Storage SKU to use. LRS = lowest cost; ZRS = high availability. Note: Standard SKUs are not available for Data Lake Storage Gen2. Allowed: `Premium_LRS`, `Premium_ZRS`. | "Premium_LRS" |
tags | object | Optional. Tags to apply to all resources. We will also add the cm-resource-parent tag for improved cost roll-ups in Cost Management. | |
tagsByResource | object | Optional. Tags to apply to resources based on their resource type. Resource type specific tags will be merged with tags for all resources. | |
scopesToMonitor | array | Optional. List of scope IDs to monitor and ingest cost for. | |
exportRetentionInDays | int | Optional. Number of days of cost data to retain in the `msexports` container. | 0 |
ingestionRetentionInMonths | int | Optional. Number of months of cost data to retain in the ingestion container. | 13 |
remoteHubStorageUri | string | Optional. Storage account to push data to for ingestion into a remote hub. | |
remoteHubStorageKey | string | Optional. Storage account key to use when pushing data to a remote hub. | |
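To illustrate how these parameters fit together, here is a minimal deployment parameter file that overrides a few of the optional parameters above. The values are examples only; every parameter shown has a sensible default and can be omitted.

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "hubName": { "value": "finops-hub" },
    "storageSku": { "value": "Premium_LRS" },
    "exportRetentionInDays": { "value": 0 },
    "ingestionRetentionInMonths": { "value": 13 }
  }
}
```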
🎛️ Resources
The following resources are created in the target resource group during deployment.
Resources use the following naming convention: `<hubName>-<purpose>-<unique-suffix>`. Names are adjusted to account for length and character restrictions. The `<unique-suffix>` is used to ensure resource names are globally unique where required.
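As a rough sketch of that convention (not the template's actual logic), the name for the storage account drops hyphens and is truncated because storage account names only allow 3-24 lowercase alphanumeric characters, while other resources keep the hyphenated form:

```python
def hub_resource_name(hub_name: str, purpose: str, unique_suffix: str) -> str:
    """Illustrative sketch of the <hubName>-<purpose>-<unique-suffix> convention.

    Storage account names are restricted to 3-24 lowercase alphanumeric
    characters, so hyphens are stripped and the name is truncated; other
    resource types keep the hyphenated form. This is a simplification of
    what the template actually does.
    """
    if purpose == "store":
        name = f"{hub_name}{purpose}{unique_suffix}"
        name = "".join(c for c in name.lower() if c.isalnum())
        return name[:24]
    return f"{hub_name}-{purpose}-{unique_suffix}"

print(hub_resource_name("finops-hub", "store", "abc123xyz"))   # finopshubstoreabc123xyz
print(hub_resource_name("finops-hub", "engine", "abc123xyz"))  # finops-hub-engine-abc123xyz
```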
- `<hubName>store<unique-suffix>` storage account (Data Lake Storage Gen2)
  - Blob containers:
    - `msexports` – Temporarily stores Cost Management exports.
    - `ingestion` – Stores ingested data. In the future, this container will also be used to stage external data from outside Cost Management.
    - `config` – Stores hub metadata and configuration settings. Files:
      - `settings.json` – Hub settings.
      - `schemas/focuscost_1.0.json` – FOCUS 1.0 schema definition for parquet conversion.
      - `schemas/focuscost_1.0-preview(v1).json` – FOCUS 1.0-preview schema definition for parquet conversion.
- `<hubName>-engine-<unique-suffix>` Data Factory instance
  - Pipelines:
    - `msexports_ExecuteETL` – Queues the `msexports_ETL_ingestion` pipeline to account for Data Factory pipeline trigger limits.
    - `msexports_ETL_transform` – Converts Cost Management exports into parquet and removes historical data duplicated in each day's export.
    - `config_ConfigureExports` – Creates Cost Management exports for all scopes.
    - `config_StartBackfillProcess` – Runs the backfill job for each month based on retention settings.
    - `config_RunBackfillJob` – Creates and triggers exports for all defined scopes for the specified date range.
    - `config_StartExportProcess` – Gets a list of all Cost Management exports configured for this hub based on the scopes defined in settings.json, then runs each export using the `config_RunExportJobs` pipeline.
    - `config_RunExportJobs` – Runs the specified Cost Management exports.
  - Triggers:
    - `config_SettingsUpdated` – Triggers the `config_ConfigureExports` pipeline when settings.json is updated.
    - `config_DailySchedule` – Triggers the `config_RunExportJobs` pipeline daily for the current month's cost data.
    - `config_MonthlySchedule` – Triggers the `config_RunExportJobs` pipeline monthly for the previous month's cost data.
    - `msexports_FileAdded` – Triggers the `msexports_ExecuteETL` pipeline when Cost Management exports complete.
- `<hubName>-vault-<unique-suffix>` Key Vault instance
  - Secrets:
    - Data Factory system managed identity
In addition to the above, the following resources are created to automate the deployment process. The deployment scripts are deleted automatically after a successful deployment, but do not delete the managed identities, as doing so may cause errors when upgrading to the next release.
- Managed identities:
  - `<storage>_blobManager` (Storage Blob Data Contributor) – Uploads the settings.json file.
  - `<datafactory>_triggerManager` (Data Factory Contributor) – Stops triggers before deployment and starts them after deployment.
- Deployment scripts (automatically deleted after a successful deployment):
  - `<datafactory>_deleteOldResources` – Deletes unused resources from previous FinOps hubs deployments.
  - `<datafactory>_stopTriggers` – Stops all triggers in the hub using the triggerManager identity.
  - `<datafactory>_startTriggers` – Starts all triggers in the hub using the triggerManager identity.
  - `<storage>_uploadSettings` – Uploads the settings.json file using the blobManager identity.
📤 Outputs
Output | Type | Description | Value |
---|---|---|---|
name | String | Name of the deployed hub instance. | |
location | String | Azure resource location resources were deployed to. | location |
dataFactorytName | String | Name of the Data Factory. | dataFactory.name |
storageAccountId | String | Resource ID of the storage account created for the hub instance. This must be used when creating the Cost Management export. | storage.outputs.resourceId |
storageAccountName | String | Name of the storage account created for the hub instance. This must be used when connecting FinOps toolkit Power BI reports to your data. | storage.outputs.name |
storageUrlForPowerBI | String | URL to use when connecting custom Power BI reports to your data. | 'https://${storage.outputs.name}.dfs.${environment().suffixes.storage}/${storage.outputs.ingestionContainer}' |
managedIdentityId | String | Object ID of the Data Factory managed identity. This will be needed when configuring managed exports. | dataFactory.identity.principalId |
managedIdentityTenantId | String | Azure AD tenant ID. This will be needed when configuring managed exports. | tenant().tenantId |
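As an illustration of how the `storageUrlForPowerBI` output is assembled, the following sketch mirrors the Bicep expression above. It assumes the public Azure cloud storage suffix (`core.windows.net`); `environment().suffixes.storage` resolves to a different value in sovereign clouds.

```python
def storage_url_for_power_bi(storage_account_name: str,
                             ingestion_container: str = "ingestion",
                             storage_suffix: str = "core.windows.net") -> str:
    """Sketch of the Bicep expression
    'https://${name}.dfs.${environment().suffixes.storage}/${container}'.
    The suffix shown is for the public Azure cloud only."""
    return f"https://{storage_account_name}.dfs.{storage_suffix}/{ingestion_container}"

print(storage_url_for_power_bi("finopshubstoreabc123xyz"))
# https://finopshubstoreabc123xyz.dfs.core.windows.net/ingestion
```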