Section 11 - Deploy Azure SQL
Requirements
An Azure Administrator is required for this step. The user must have:
- `Owner` role for the subscription
- Access to the Azure Portal
Manual Steps:
Step 1: Create SQL Database
- Browse to and log into the Azure Portal, using the correct URL for your environment:

| Environment | URL |
|---|---|
| Worldwide (Commercial) & GCC | https://portal.azure.com |
| GCC-High and DoD | https://portal.azure.us |
- Type `azure sql` in the top search textbox
- Select `Azure SQL Database` from the search results
- Click on `Create`
- Set the following properties, and click on `Create new` under the `Server` property
| Name | Value |
|---|---|
| Subscription | The subscription created for this toolkit |
| Resource Group | sub-spark-rg |
| Database Name | sqldb-spark |
- Set the following properties and click on `OK`
| Name | Value |
|---|---|
| Server Name | sub-sql-spark-svr |
| Location | The same region you have been using for the Azure services |
| Authentication Method | Use Microsoft Entra-only Authentication |
| Set Microsoft Entra admin | SPARK System Admins |
- Set the following properties, and click on `Next`
For Tenant Sizes: < 50k sites
| Name | Value |
|---|---|
| Compute + storage | Standard S0 2GB |
| Backup storage redundancy | Locally-redundant backup storage |
For Tenant Sizes: < 100k sites
| Name | Value |
|---|---|
| Compute + storage | Standard S1 5GB |
| Backup storage redundancy | Locally-redundant backup storage |
For Tenant Sizes: > 100k sites
| Name | Value |
|---|---|
| Compute + storage | Standard S2 5GB |
| Backup storage redundancy | Locally-redundant backup storage |
For tenants with more than 150k sites, it is recommended to scale compute + storage up to Premium P1 during intake and inventory runbook jobs, then return to Standard S1 after the jobs have completed, as sketched below.
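The tier change can also be scripted. A minimal sketch using the `azure-mgmt-sql` package (an assumed dependency, not part of the toolkit); the resource names match the values above, and the subscription ID is a placeholder:

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.sql import SqlManagementClient
from azure.mgmt.sql.models import DatabaseUpdate, Sku

# Placeholder subscription ID; use the toolkit subscription.
client = SqlManagementClient(DefaultAzureCredential(), "<subscription-id>")

def scale_database(sku_name: str, tier: str) -> None:
    """Scale sqldb-spark to the given service objective and wait for completion."""
    poller = client.databases.begin_update(
        resource_group_name="sub-spark-rg",
        server_name="sub-sql-spark-svr",
        database_name="sqldb-spark",
        parameters=DatabaseUpdate(sku=Sku(name=sku_name, tier=tier)),
    )
    poller.result()

# Scale up before the intake/inventory runbook jobs, then back down after.
scale_database("P1", "Premium")
# ... run the runbook jobs ...
scale_database("S1", "Standard")
```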
- Set the following properties, and click on `Next`
| Name | Value |
|---|---|
| Connectivity method | Public endpoint |
| Allow Azure services and resources to access this server | Yes |
- Click on `Configure Identities` for the `Server identity` property
- Add the `uami-spark-spoactions` identity
- Select the `uami-spark-spoactions` identity as the `Primary Identity`
- Click on `Apply`
If this section is not available during creation, you can set this property later.
- Set `Enable secure enclaves` to `On`
- Click on `Next` until you reach `Review + create`
- Click on `Create`
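For teams that prefer automation, the server and database creation can be approximated with the `azure-mgmt-sql` SDK. This is a hedged outline of the portal steps above, not the toolkit's official method; the region, subscription ID, and the object ID of the SPARK System Admins group are placeholders:

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.sql import SqlManagementClient
from azure.mgmt.sql.models import Database, Server, ServerExternalAdministrator, Sku

client = SqlManagementClient(DefaultAzureCredential(), "<subscription-id>")

# Logical server with Microsoft Entra-only authentication.
# <admins-object-id> is the object ID of the SPARK System Admins group.
server_poller = client.servers.begin_create_or_update(
    "sub-spark-rg",
    "sub-sql-spark-svr",
    Server(
        location="<your-region>",
        administrators=ServerExternalAdministrator(
            login="SPARK System Admins",
            sid="<admins-object-id>",
            principal_type="Group",
            azure_ad_only_authentication=True,
        ),
    ),
)
server_poller.result()

# Database at the Standard S0 tier with locally-redundant backups
# (adjust the SKU per the sizing tables above).
db_poller = client.databases.begin_create_or_update(
    "sub-spark-rg",
    "sub-sql-spark-svr",
    "sqldb-spark",
    Database(
        location="<your-region>",
        sku=Sku(name="S0", tier="Standard"),
        requested_backup_storage_redundancy="Local",
    ),
)
db_poller.result()
```

Note the sketch does not cover the networking and server-identity settings; those still need to be applied as described above.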
Step 2: Configure SQL Access Control
- Access the `Azure SQL` service
- Click on `SQL logical servers`
- Select the `sub-sql-spark-svr` server created in the previous step
- Select `Access control (IAM)` and click on `Add role assignment`
- Search for `sql db`
- Select `SQL DB Contributor` and click on `Next`
- Select `Managed identity` for `Assign access to`
- Select the `uami-spark-spoactions` identity, and click on `Select`
- Click on `Review + assign`, and then `Review + assign` again to add the role assignment
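The same role assignment can be scripted with the `azure-mgmt-authorization` package. A sketch under assumed values; the SQL DB Contributor GUID below is the built-in role definition ID as commonly documented, so verify it in your tenant before relying on it:

```python
import uuid

from azure.identity import DefaultAzureCredential
from azure.mgmt.authorization import AuthorizationManagementClient
from azure.mgmt.authorization.models import RoleAssignmentCreateParameters

subscription_id = "<subscription-id>"
client = AuthorizationManagementClient(DefaultAzureCredential(), subscription_id)

# Scope: the logical server created in Step 1.
scope = (
    f"/subscriptions/{subscription_id}"
    "/resourceGroups/sub-spark-rg"
    "/providers/Microsoft.Sql/servers/sub-sql-spark-svr"
)

# Built-in "SQL DB Contributor" role definition ID (verify with
# `az role definition list --name "SQL DB Contributor"`).
role_definition_id = (
    f"/subscriptions/{subscription_id}/providers/"
    "Microsoft.Authorization/roleDefinitions/"
    "9b7fa17d-e63e-47b0-bb0a-15c516ac86ec"
)

client.role_assignments.create(
    scope,
    str(uuid.uuid4()),  # a role assignment name must be a new GUID
    RoleAssignmentCreateParameters(
        role_definition_id=role_definition_id,
        principal_id="<uami-spark-spoactions-principal-id>",
        principal_type="ServicePrincipal",  # managed identities use this type
    ),
)
```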
Step 3: Configure SQL UAMI Identity
- Select `SQL databases` from the left navigation
- Select the `sqldb-spark` database
- Under `Security`, click on `Identity`
- Select `Add`, select the `uami-spark-spoactions` identity, and then click on `Add`
- Click `Save`
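To confirm the identity was attached, a quick check with `azure-mgmt-sql` (assuming a recent SDK version that surfaces the database-level identity):

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.sql import SqlManagementClient

client = SqlManagementClient(DefaultAzureCredential(), "<subscription-id>")
db = client.databases.get("sub-spark-rg", "sub-sql-spark-svr", "sqldb-spark")

# Expect type "UserAssigned" with the uami-spark-spoactions resource ID listed.
print(db.identity.type if db.identity else "No identity attached")
if db.identity and db.identity.user_assigned_identities:
    for resource_id in db.identity.user_assigned_identities:
        print(resource_id)
```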
Step 4: Get SQL Connection String
- Click on `SQL databases` from the left navigation
- Select the `sqldb-spark` database
- Under `Overview`, click on `See connection strings`
- Annotate the `Microsoft Entra passwordless authentication` database connection string
Update Connection String

Replace `TrustServerCertificate=False;Connection Timeout=30;Authentication="Active Directory Default";` with `MultipleActiveResultSets=True;`

Before:

```
Server=tcp:sub-sql-spark-svr-qa1.database.windows.net,1433;Initial Catalog=sub-sql-spark-svr-qa1;Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;Authentication="Active Directory Default";
```

After:

```
Server=tcp:sub-sql-spark-svr.database.windows.net,1433;Initial Catalog=sqldb-spark;Encrypt=True;MultipleActiveResultSets=True;
```
Annotate the following variables in the template spreadsheet:
- `v_sqlConnectionString`: The SQL connection string
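Because the updated string carries no credentials, the caller supplies a Microsoft Entra access token at connect time. A minimal sketch using `pyodbc` and `azure-identity` (assumed packages); the ODBC key names differ from the ADO.NET string above (`Database` vs `Initial Catalog`, `MARS_Connection` vs `MultipleActiveResultSets`), and the token injection follows the documented `SQL_COPT_SS_ACCESS_TOKEN` pattern:

```python
import struct

import pyodbc
from azure.identity import DefaultAzureCredential

# ODBC translation of the connection string above.
conn_str = (
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=tcp:sub-sql-spark-svr.database.windows.net,1433;"
    "Database=sqldb-spark;Encrypt=yes;MARS_Connection=yes;"
)

# Acquire an Entra token for Azure SQL and pack it into the
# UTF-16-LE, length-prefixed structure the ODBC driver expects.
token = DefaultAzureCredential().get_token("https://database.windows.net/.default")
token_bytes = token.token.encode("utf-16-le")
token_struct = struct.pack(f"<I{len(token_bytes)}s", len(token_bytes), token_bytes)

SQL_COPT_SS_ACCESS_TOKEN = 1256  # pre-connect attribute for access tokens
conn = pyodbc.connect(conn_str, attrs_before={SQL_COPT_SS_ACCESS_TOKEN: token_struct})
print(conn.cursor().execute("SELECT SUSER_SNAME()").fetchone()[0])
```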
Step 5: Connect to SQL Database
- Click on `SQL databases` from the left navigation
- Select the `sqldb-spark` database
- Under `Query editor`, click on `Allow IP [IP Address]...` to be able to connect to the database
- Click on `Continue as [user account]` to access the query editor

The user must be a member of the SPARK System Admins group.
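If the browser query editor is unavailable, connectivity can also be checked from a workstation. A sketch using `pyodbc` with interactive Entra sign-in, assuming a recent Microsoft ODBC driver that supports `Authentication=ActiveDirectoryInteractive`:

```python
import pyodbc

# Prompts for an interactive Microsoft Entra sign-in; the signed-in user
# must be a member of the SPARK System Admins group.
conn = pyodbc.connect(
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=tcp:sub-sql-spark-svr.database.windows.net,1433;"
    "Database=sqldb-spark;Encrypt=yes;"
    "Authentication=ActiveDirectoryInteractive;"
)
print(conn.cursor().execute("SELECT @@VERSION").fetchone()[0])
```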
Step 6: Run SQL Queries
- From the query editor, run the following scripts in the order shown in the table below (a programmatic alternative is sketched after the table)
- Confirm each script runs successfully
| Name | File |
|---|---|
| Create Users | 01_create_spark_sql_users |
| Create Tables | 02_create_spark_sql_tables |
| Create Views | 03_create_spark_sql_views |
| Create Stored Procedures | 04_create_spark_sql_procs |
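The scripts can also be executed programmatically over the token-based connection from Step 4. A sketch that runs the four files in order, splitting on `GO` batch separators since `pyodbc` does not understand them; `get_connection` is a hypothetical helper wrapping the earlier connection snippet, and the `.sql` extension is an assumption:

```python
import re
from pathlib import Path

SCRIPTS = [
    "01_create_spark_sql_users",
    "02_create_spark_sql_tables",
    "03_create_spark_sql_views",
    "04_create_spark_sql_procs",
]

def run_script(conn, path: Path) -> None:
    """Execute a T-SQL file batch by batch; GO is a client-side separator."""
    sql = path.read_text(encoding="utf-8")
    for batch in re.split(r"^\s*GO\s*$", sql, flags=re.IGNORECASE | re.MULTILINE):
        if batch.strip():
            conn.cursor().execute(batch)
    conn.commit()

conn = get_connection()  # hypothetical helper: token-based pyodbc connect (Step 4)
for name in SCRIPTS:
    run_script(conn, Path(f"{name}.sql"))  # assumed .sql extension
    print(f"OK: {name}")
```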
Continue to creating the Function Apps