Sync (remote cache)
Overview
The `cloudpack sync` command downloads package bundles into the Cloudpack cache ahead of time. This significantly improves start times by reducing the number of packages that must be bundled locally when starting the inner loop.
The sync process has two major components:
- Upload bundles to the remote cache (`cloudpack sync --upload`), usually during a CI build.
- Download bundles to the developer's machine (`cloudpack sync`), usually as a `postinstall` step.
There are a few steps for enabling sync:
- Create the Azure resources and configure permissions
- Configure Cloudpack
- Enable downloading bundles for developers
- Enable uploading bundles in CI
Currently, Cloudpack supports only Azure Blob Storage as the remote cache storage location.
1. Storage and permissions setup
- Azure storage account setup: Create a storage account with a dedicated container for Cloudpack asset storage.
- Lifecycle management policies: Use lifecycle management policies to automatically delete unused cached assets after a predefined number of days, keeping storage costs down.
- Developer access permissions: For security, grant developers the "Storage Blob Data Reader" role for only the specific container in the storage account. It's easiest to manage this using a Microsoft Entra security group.
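If you prefer not to manage the role assignment in a template, a container-scoped assignment can also be sketched with the Azure CLI. Every name and ID below is a placeholder to substitute with your own values:

```shell
# Placeholders throughout; substitute your subscription, group, account, and container.
# The role is granted at container scope, not the whole storage account.
az role assignment create \
  --assignee "<developers-security-group-object-id>" \
  --role "Storage Blob Data Reader" \
  --scope "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account>/blobServices/default/containers/<container>"
```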
Automating setup with Bicep templates
To automate the steps above, you can use Bicep templates, which provide a declarative way to define your Azure infrastructure.
Here's an example of using Bicep templates for Azure storage account and container setup:

```bicep
// Storage account name must be between 3 and 24 characters in length and use numbers and lower-case letters only.
param storageAccountName string = '<Your Storage Account Name>'

// Cache versioning is used to invalidate the cache when the cache configuration changes.
param containerName string = '<Container Name>-v0'

// The storage account will be in the same region as the resource group.
param location string = resourceGroup().location

// Create the storage account
// https://learn.microsoft.com/en-us/azure/templates/microsoft.storage/storageaccounts?pivots=deployment-language-bicep
resource storageAccount 'Microsoft.Storage/storageAccounts@2022-09-01' = {
  name: storageAccountName
  location: location
  sku: {
    name: 'Standard_LRS'
  }
  kind: 'StorageV2'
  properties: {
    supportsHttpsTrafficOnly: true
  }
}

// Explicitly represent the blob service (though this is just symbolic)
// https://learn.microsoft.com/en-us/azure/templates/microsoft.storage/storageaccounts/blobservices?pivots=deployment-language-bicep
resource blobService 'Microsoft.Storage/storageAccounts/blobServices@2022-09-01' = {
  name: 'default'
  parent: storageAccount
  properties: {}
}

// Create the blob container in the blob service
// https://learn.microsoft.com/en-us/azure/templates/microsoft.storage/storageaccounts/blobservices/containers?pivots=deployment-language-bicep
resource blobContainer 'Microsoft.Storage/storageAccounts/blobServices/containers@2022-09-01' = {
  name: containerName
  parent: blobService
  properties: {}
}

// Create the management policy for the storage account
// https://learn.microsoft.com/en-us/azure/templates/microsoft.storage/storageaccounts/managementpolicies?pivots=deployment-language-bicep
resource lifecyclePolicy 'Microsoft.Storage/storageAccounts/managementPolicies@2022-09-01' = {
  name: 'default'
  parent: storageAccount
  properties: {
    policy: {
      rules: [
        {
          name: 'CleanupOldBlobs'
          enabled: true
          type: 'Lifecycle'
          definition: {
            actions: {
              baseBlob: {
                delete: {
                  daysAfterModificationGreaterThan: 30
                }
              }
            }
            filters: {
              blobTypes: [
                'blockBlob'
              ]
              prefixMatch: [
                '${containerName}/'
              ]
            }
          }
        }
      ]
    }
  }
}

// Storage blob data reader role (standard value)
var storageBlobDataReaderRoleId = '2a2b9908-6ea1-4ae2-8e65-a410df84e7d1'

// Replace with your Microsoft Entra security group object ID
var developersSecurityGroupObjectId = '<...>'

// Add a role assignment so the group can read blobs in the container only
// https://learn.microsoft.com/en-us/azure/templates/microsoft.authorization/roleassignments?pivots=deployment-language-bicep
resource storeAccountReaderRoleAssignment 'Microsoft.Authorization/roleAssignments@2022-04-01' = {
  name: guid(blobContainer.id, storageBlobDataReaderRoleId, developersSecurityGroupObjectId)
  scope: blobContainer
  properties: {
    roleDefinitionId: subscriptionResourceId('Microsoft.Authorization/roleDefinitions', storageBlobDataReaderRoleId)
    principalId: developersSecurityGroupObjectId
    principalType: 'Group'
  }
}

output storageAccountId string = storageAccount.id
output blobServiceId string = blobService.id
output blobContainerId string = blobContainer.id
```

Then create a script like this to deploy the template. This can be run manually or automatically as part of a deployment pipeline. (Alternatively, you can use Azure Pipelines ARM template deployment tasks.)
```bash
#!/bin/bash
# Fail on error
set -euo pipefail

# Variables
resourceGroupName="<your resource group name>"
bicepFilePath="<your bicep file name>.bicep"
subscriptionId="<your Azure subscription ID>"

# Validate the Bicep template; the 'error' property is empty on success
validationResult=$(az deployment group validate \
  --subscription "$subscriptionId" \
  --resource-group "$resourceGroupName" \
  --template-file "$bicepFilePath" \
  --query 'error' \
  --output tsv)

if [[ -n "$validationResult" ]]; then
  echo "Validation failed: $validationResult"
  exit 1
fi

echo "Validation passed. Deploying the Bicep template..."
az deployment group create \
  --subscription "$subscriptionId" \
  --resource-group "$resourceGroupName" \
  --template-file "$bicepFilePath"

echo "Deployment complete."
```

2. Cloudpack configuration
Add the following configuration section to `cloudpack.config.json`, replacing the account and container names with your own. See the `RemoteCacheConfig` docs for details and additional options.
```jsonc
{
  "features": {
    // Required feature flag to enable sync
    "syncBundles": true,
    // Optional features: see below
    // "syncInternalPackages": true,
    // "enableRemoteCacheDownloads": true
  },
  "remoteCache": {
    "storageAccount": "YOUR STORAGE ACCOUNT NAME",
    "container": "YOUR CONTAINER NAME"
  }
}
```

Optional features
The following flags can optionally be enabled under `features`:
- `syncInternalPackages`: Enables syncing of packages from within your own repo. This can allow for faster startup (especially if many of your repo's packages don't change often) but may create much more cache churn and increase download times.
- `enableRemoteCacheDownloads`: Enables downloading bundles directly from the remote cache when available.
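For example, a `features` section with both optional flags turned on (using the same flag names documented above) would look like:

```jsonc
{
  "features": {
    "syncBundles": true,
    "syncInternalPackages": true,
    "enableRemoteCacheDownloads": true
  }
}
```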
3. Downloading bundles for developers
Generally, the best way to populate the local Cloudpack cache is to run `cloudpack sync` in a `postinstall` step. This way, Cloudpack downloads the cached resources once the npm/yarn installation completes, and developers don't need to run a separate command to populate the Cloudpack cache.
Add the following to your monorepo's root `package.json` (or your application's `package.json` if you don't use a monorepo):
```json
{
  "scripts": {
    "postinstall": "cloudpack sync"
  }
}
```

This will prompt the user to log in the first time the script runs (see the sync command docs for details).
4. Uploading bundles in CI
To populate the remote cache, run `cloudpack sync --upload` in a CI pipeline, such as the build for main/master.
The following instructions cover integration with Azure DevOps (ADO); a similar approach should be possible with GitHub Actions.
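As a sketch only (this document does not cover GitHub Actions directly), the equivalent workflow steps could authenticate with the `azure/login` action and then run the same commands. The secret names below, and the assumption that `--non-interactive-login` picks up the resulting Azure CLI session, are hypothetical:

```yaml
# Hypothetical GitHub Actions excerpt; assumes a federated credential
# is configured for OIDC login to Azure.
permissions:
  id-token: write
  contents: read

steps:
  - uses: actions/checkout@v4
  - uses: azure/login@v2
    with:
      client-id: ${{ secrets.AZURE_CLIENT_ID }}
      tenant-id: ${{ secrets.AZURE_TENANT_ID }}
      subscription-id: ${{ secrets.AZURE_SUBSCRIPTION_ID }}
  - run: |
      az account show
      yarn cloudpack sync --non-interactive-login
      yarn cloudpack sync --upload --non-interactive-login
```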
Configure Azure service connection
Start by creating a new service connection of the type Azure Resource Manager in your ADO project settings. This will allow Cloudpack to authenticate with Azure.
Ensure the pipeline that will execute cloudpack sync --upload has permission to use the service connection.
Add to pipeline
Integrate the YAML snippet below into your pipeline configuration file. Notes:
- `azureSubscription`: Replace with your specific service connection's name.
- `az account show`: Useful for diagnosing the connected account.
- `cloudpack sync --non-interactive-login`: Logs in using the service connection and downloads previously-cached assets.
- `cloudpack sync --upload --non-interactive-login`: Bundles any unbundled packages and uploads them to Azure storage.
```yaml
- task: AzureCLI@2
  displayName: Cloudpack Sync
  inputs:
    azureSubscription: 'YOUR SERVICE CONNECTION NAME'
    scriptType: bash
    scriptLocation: inlineScript
    inlineScript: |
      az account show
      yarn cloudpack sync --non-interactive-login
      yarn cloudpack sync --upload --non-interactive-login
```