Restoring SQL Server BACPAC Files on Mac M3

Overview

This guide covers how to restore a .bacpac database backup file to SQL Server running in Docker on Mac M3 (Apple Silicon) using SqlPackage.

Prerequisites

  • SQL Server running in Docker
  • .bacpac backup file
  • Terminal access
  • Docker container running and accessible

Installation Steps

1. Download SqlPackage for macOS

SqlPackage is Microsoft’s command-line utility for importing and exporting SQL Server databases.

# Navigate to Downloads folder
cd ~/Downloads

# Download SqlPackage for macOS (universal binary - works on ARM64/M3)
curl -L -o sqlpackage.zip https://aka.ms/sqlpackage-macos

# Extract the package
unzip sqlpackage.zip -d sqlpackage

# Make the binary executable
chmod +x sqlpackage/sqlpackage

Restoring a BACPAC File

Basic Restore Command

./sqlpackage/sqlpackage /Action:Import \
  /SourceFile:"your-database.bacpac" \
  /TargetServerName:"localhost,PORT" \
  /TargetDatabaseName:"YourDatabaseName" \
  /TargetUser:"sa" \
  /TargetPassword:'YourPassword' \
  /TargetTrustServerCertificate:True

Example Usage

./sqlpackage/sqlpackage /Action:Import \
  /SourceFile:"SampleDatabase.bacpac" \
  /TargetServerName:"localhost,5433" \
  /TargetDatabaseName:"ESOLAut0mater12" \
  /TargetUser:"sa" \
  /TargetPassword:'SecureP@ssword' \
  /TargetTrustServerCertificate:True

Example with Extended Timeout (for large databases)

./sqlpackage/sqlpackage /Action:Import \
  /SourceFile:"SampleDatabase.bacpac" \
  /TargetServerName:"localhost,5433" \
  /TargetDatabaseName:"ESOLAut0mater12" \
  /TargetUser:"sa" \
  /TargetPassword:'SecureP@ssword' \
  /TargetTrustServerCertificate:True \
  /TargetTimeout:600

Important Parameters Explained

  • /Action:Import – specifies that we're importing a BACPAC file
  • /SourceFile – path to your .bacpac file
  • /TargetServerName – server address with port (format: hostname,port)
  • /TargetDatabaseName – name for the restored database
  • /TargetUser – SQL Server username (typically sa)
  • /TargetPassword – SQL Server password
  • /TargetTrustServerCertificate:True – required for Docker SQL Server; trusts self-signed certificates
  • /TargetTimeout:600 – optional; connection timeout in seconds (use for large databases)
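The parameters above can be collected into a small wrapper script so you only edit values in one place. A sketch under assumptions: every value below (binary path, file names, server, password) is a placeholder for your own environment, not something the guide prescribes.

```shell
# All values are placeholders - edit for your environment.
SQLPACKAGE="./sqlpackage/sqlpackage"
BACPAC="your-database.bacpac"
SERVER="localhost,5433"
DB="YourDatabaseName"
PASSWORD='YourPassword'

# Build the argument list once (POSIX-compatible, no bash arrays needed).
set -- /Action:Import \
  /SourceFile:"$BACPAC" \
  /TargetServerName:"$SERVER" \
  /TargetDatabaseName:"$DB" \
  /TargetUser:"sa" \
  /TargetPassword:"$PASSWORD" \
  /TargetTrustServerCertificate:True

# Dry run: show the exact command that would be executed.
# Remove the 'echo' to actually run the import.
echo "$SQLPACKAGE" "$@"
```

The dry-run `echo` lets you inspect the final command (quoting included) before pointing it at a real server.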

Common Issues and Solutions

1. SSL Certificate Error

Error: The remote certificate was rejected by the provided RemoteCertificateValidationCallback

Solution: Add the /TargetTrustServerCertificate:True parameter (already included in the examples above).

2. Password with Special Characters

If your password contains characters the shell treats specially, such as !, @, or $, wrap it in single quotes so the shell passes it through literally:

/TargetPassword:'SecureP@ssword'
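Why single quotes matter can be seen with a quick shell demonstration (the password value here is just an example):

```shell
# In double quotes the shell expands $-sequences; in single quotes it does not.
unset word
printf 'double quotes: %s\n' "P@ss$word"   # $word expands to nothing -> "P@ss"
printf 'single quotes: %s\n' 'P@ss$word'   # preserved exactly as typed
```

If the real password had contained `$word`, the double-quoted version would silently send the wrong password to SQL Server.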

3. Connection Format

  • ✅ Correct: localhost,5433 (comma separator)
  • ❌ Incorrect: localhost;5433 (semicolon separator)
  • ❌ Incorrect: localhost:5433 (colon separator)
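A quick pre-flight check of the server string can catch the wrong separator before SqlPackage fails with a confusing connection error. This is an illustrative helper, not part of SqlPackage:

```shell
# Validate the host,port separator before running the import.
server="localhost,5433"
case "$server" in
  *\;*|*:*) echo "bad separator: use host,port" ;;
  *,*)      echo "ok: host,port" ;;
  *)        echo "no port given: SQL Server default 1433 will be used" ;;
esac
```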

4. Port Numbers

Common SQL Server ports:

  • Default: 1433
  • Custom Docker: 5433 (or whatever you mapped in your Docker run command)

Find your Docker port mapping:

docker ps
# Look for something like: 0.0.0.0:5433->1433/tcp
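The host port can also be extracted from the PORTS column programmatically. The mapping string below is a hard-coded example; in practice you might fetch it with something like `docker ps --format '{{.Ports}}'`:

```shell
# Pull the host-side port out of a Docker port-mapping string like "0.0.0.0:5433->1433/tcp".
ports="0.0.0.0:5433->1433/tcp"
host_port=$(printf '%s' "$ports" | sed -E 's/.*:([0-9]+)->1433\/tcp.*/\1/')
echo "$host_port"   # -> 5433
```

The extracted value is what goes after the comma in /TargetServerName.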

5. Transport-Level Errors with Large Tables

Error: A transport-level error has occurred when receiving results from the server

Symptoms:

  • Import fails consistently at the same tables (often audit or log tables)
  • Error message: Transport-level error or Invalid argument
  • Import works up to a certain point, then crashes

Root Cause: Some tables (like AbpAuditLogs, AbpEntityChanges, AbpEntityPropertyChanges) can contain massive amounts of data that cause timeout or memory issues during bulk import on Mac M3 with Docker SQL Server.

Solution: Manually Remove Large Table Data from BACPAC

Since BACPAC is essentially a ZIP file, you can extract it, remove problematic table data, and re-package it.

Step-by-Step Process

# Navigate to your BACPAC location
cd ~/Downloads

# Remove previous folders if they exist
rm -rf YourDatabase_Modified.bacpac
rm -rf bacpac_extracted

# Unzip the BACPAC
unzip YourDatabase.bacpac -d bacpac_extracted

# Navigate to the Data folder
cd bacpac_extracted/Data

# Remove the large table data files (BCP files contain the actual data)
# Common problematic tables (adjust based on your error messages):
rm -rf dbo.AbpAuditLogs/*.BCP
rm -rf dbo.AbpEntityChanges/*.BCP
rm -rf dbo.AbpEntityPropertyChanges/*.BCP

# Add more tables if needed:
# rm -rf dbo.YourLargeTable/*.BCP

# Return to parent directory and re-zip
cd ..
zip -r ../YourDatabase_Modified.bacpac *

# Go back to Downloads folder
cd ..

# Import the modified BACPAC
./sqlpackage/sqlpackage /Action:Import \
  /SourceFile:"YourDatabase_Modified.bacpac" \
  /TargetServerName:"localhost,5433" \
  /TargetDatabaseName:"YourDatabaseName" \
  /TargetUser:"sa" \
  /TargetPassword:'YourPassword' \
  /TargetTrustServerCertificate:True \
  /TargetTimeout:600

Complete Example

cd ~/Downloads

# Clean up previous attempts
rm -rf ESOLAut0mater_Modified.bacpac
rm -rf bacpac_extracted

# Unzip the BACPAC
unzip SampleDatabase.bacpac -d bacpac_extracted

# Remove problematic table data
cd bacpac_extracted/Data
rm -rf dbo.AbpAuditLogs/*.BCP
rm -rf dbo.AbpEntityChanges/*.BCP
rm -rf dbo.AbpEntityPropertyChanges/*.BCP

# Re-zip it
cd ..
zip -r ../ESOLAut0mater_Modified.bacpac *

# Import the modified BACPAC
cd ..
./sqlpackage/sqlpackage /Action:Import \
  /SourceFile:"ESOLAut0mater_Modified.bacpac" \
  /TargetServerName:"localhost,5433" \
  /TargetDatabaseName:"ESOLAut0mater12" \
  /TargetUser:"sa" \
  /TargetPassword:'SecureP@ssword' \
  /TargetTrustServerCertificate:True \
  /TargetTimeout:600

How to Identify Problematic Tables

  1. Look at the import output – it shows which table was processing when it failed:

    Processing Table '[dbo].[AbpAuditLogs]'.
    *** A transport-level error has occurred...
    
  2. Check the Data folder structure in the extracted BACPAC:

    cd bacpac_extracted/Data
    du -sh */ | sort -h
    # This shows folder sizes - large folders are likely culprits
    
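The failing table can also be pulled straight out of a saved import log. A small sketch; the log lines below are simulated for the demonstration, not real sqlpackage output:

```shell
# Simulate a saved import log, then grab the last table mentioned before the crash.
cat > import.log <<'EOF'
Processing Table '[dbo].[Users]'.
Processing Table '[dbo].[AbpAuditLogs]'.
*** A transport-level error has occurred...
EOF

# The last bracketed schema.table pair is the likely culprit.
grep -o "\[[a-zA-Z]*\]\.\[[a-zA-Z]*\]" import.log | tail -n 1   # -> [dbo].[AbpAuditLogs]
```

In practice you would redirect the real sqlpackage run into a log file (e.g. `... 2>&1 | tee import.log`) and run the same grep against that.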

Important Notes

  • Schema Preservation: This method only removes table DATA, not the table structure. The tables will exist but will be empty.
  • Foreign Keys: If other tables reference the data you’re removing, you may need to handle those dependencies.
  • Audit Tables: Tables like AbpAuditLogs, AbpEntityChanges, AbpEntityPropertyChanges are typically audit/log tables and are safe to empty for development environments.
  • Production Warning: Do not use this method for production imports where you need complete data integrity.
  • Alternative: If you need the data from these tables, consider using Azure Data Studio which sometimes handles large imports better, or increase Docker memory allocation significantly (8GB+).

Verify After Import

After importing the modified BACPAC, verify your data:

USE YourDatabaseName;
GO

-- Check table count
SELECT COUNT(*) AS TotalTables FROM sys.tables;

-- Check row counts for all tables
SELECT
    SCHEMA_NAME(t.schema_id) + '.' + t.name AS TableName,
    SUM(p.rows) AS [RowCount]
FROM sys.tables t
INNER JOIN sys.partitions p ON t.object_id = p.object_id
WHERE p.index_id IN (0, 1)
GROUP BY t.schema_id, t.name
ORDER BY [RowCount] DESC;

-- Verify important business tables have data
SELECT COUNT(*) FROM [YourSchema].[YourImportantTable];

Optional: Install Globally

To use SqlPackage from any directory:

# Move to permanent location
sudo mkdir -p /usr/local/sqlpackage
sudo cp -r sqlpackage/* /usr/local/sqlpackage/

# Create symbolic link
sudo ln -s /usr/local/sqlpackage/sqlpackage /usr/local/bin/sqlpackage

# Now you can run from anywhere
sqlpackage /Action:Import \
  /SourceFile:"$HOME/path/to/database.bacpac" \
  /TargetServerName:"localhost,5433" \
  /TargetDatabaseName:"DatabaseName" \
  /TargetUser:"sa" \
  /TargetPassword:'YourPassword' \
  /TargetTrustServerCertificate:True

Verifying the Restore

After successful import, verify the database using VS Code with mssql extension or any SQL client:

-- List all databases
SELECT name FROM sys.databases;

-- Check table count in restored database
USE YourDatabaseName;
SELECT COUNT(*) AS TableCount 
FROM INFORMATION_SCHEMA.TABLES 
WHERE TABLE_TYPE = 'BASE TABLE';

Alternative Tools

If SqlPackage doesn’t work for your use case:

  1. Azure Data Studio (Recommended GUI option)

    brew install --cask azure-data-studio
    
    • Right-click “Databases” → “Import Data-tier Application”
    • Browse to your .bacpac file
  2. Direct Docker approach (if SqlPackage is available in container)

    docker cp database.bacpac container_name:/tmp/
    docker exec -it container_name /path/to/sqlpackage /Action:Import ...
    

Additional SqlPackage Actions

SqlPackage supports other database operations:

# Export database to BACPAC
./sqlpackage/sqlpackage /Action:Export \
  /SourceServerName:"localhost,5433" \
  /SourceDatabaseName:"DatabaseName" \
  /SourceUser:"sa" \
  /SourcePassword:'Password' \
  /TargetFile:"output.bacpac" \
  /SourceTrustServerCertificate:True

# Extract to DACPAC (schema only)
./sqlpackage/sqlpackage /Action:Extract \
  /SourceServerName:"localhost,5433" \
  /SourceDatabaseName:"DatabaseName" \
  /SourceUser:"sa" \
  /SourcePassword:'Password' \
  /TargetFile:"schema.dacpac" \
  /SourceTrustServerCertificate:True

Notes

  • BACPAC files contain both schema and data
  • DACPAC files contain only schema (no data)
  • SqlPackage on Mac M3 runs natively on ARM64 architecture
  • Self-signed certificates in Docker SQL Server require the TrustServerCertificate parameter
  • For production environments, consider using proper SSL certificates
  • Large Tables: If import fails with transport-level errors on specific tables (especially audit/log tables), you can manually remove their data from the BACPAC by unzipping it, deleting the problematic .BCP files, and re-zipping (see section 5 in Common Issues)
  • Docker Resources: For large database imports, ensure Docker has sufficient resources allocated (4GB+ RAM, 2+ CPU cores)
  • Development vs Production: The manual BACPAC modification method is suitable for development environments where audit history is not critical

Enforce a Unique Constraint on a Nullable Column With 'IGNORE_DUP_KEY = ON'

A plain UNIQUE constraint in SQL Server allows only one NULL per column. The pattern below works around this with computed "nullbuster" columns that substitute the row's primary key when the real value is NULL, so every row carries a distinct value for the constraint to check. IGNORE_DUP_KEY = ON makes duplicate inserts raise a warning and be discarded instead of failing the whole statement.

CREATE TABLE [identigy].[patron](
	[ID] [bigint] NOT NULL,
	[DISID] [nvarchar](20) NOT NULL,
	[PKID] [nvarchar](20) NULL,
	[DISHID] [nvarchar](20) NULL,
	[FIRSTNAME] [nvarchar](250) NULL,
	[LASTNAME] [nvarchar](250) NULL,
	[CLASSIFICATION] [nvarchar](250) NULL,
	[FLAGS] [nvarchar](500) NULL,
	[SHIFTS] [nvarchar](50) NULL,
	[ACTION] [nvarchar](10) NULL,
	[ACTIVE] [bit] NOT NULL CONSTRAINT [DEFAULT_Identigy_Patron_Active]  DEFAULT ((1))
	,[DISIDnullbuster] AS (CASE WHEN [DISID] IS NULL THEN CAST(ID AS VARCHAR(20)) ELSE DISID END)
	,[PKIDnullbuster] AS (CASE WHEN [PKID] IS NULL THEN CAST(ID AS VARCHAR(20)) ELSE PKID END)
 ,CONSTRAINT [PK_Identigy_Patron] PRIMARY KEY CLUSTERED ( [ID] ASC ) WITH (IGNORE_DUP_KEY = ON) ON [PRIMARY]
 ,CONSTRAINT [UX_Identigy_Patron_DISID] UNIQUE ( [DISIDnullbuster] ASC ) WITH (IGNORE_DUP_KEY = ON) ON [PRIMARY]
 ,CONSTRAINT [UX_Identigy_Patron_PKID] UNIQUE ( [PKIDnullbuster] ASC ) WITH (IGNORE_DUP_KEY = ON) ON [PRIMARY]
) ON [PRIMARY]
GO

SQL Bulk Upload

How to avoid inserting duplicate rows during a bulk insert into a table:

--BEGIN TRAN
CREATE TABLE [tlg].[UserVisitLogs](
	[TStamp] 			DATETIME,
	[SessionID] 		NVARCHAR(20),
	[CorrelationID] 	NVARCHAR(20),
	[SchoolCode] 		NVARCHAR(20),
	[SiteExternalID] 	NVARCHAR(20),
	[WebType] 			NVARCHAR(20),
	[WebUrl] 			NVARCHAR(100),
	[UserSid] 			NVARCHAR(50),
	[UserName] 			NVARCHAR(20),
	[UserType] 			NVARCHAR(20),
	[CachedUser] 		NVARCHAR(20)
)
GO

-- Create a unique clustered index with IGNORE_DUP_KEY = ON to silently discard duplicates
CREATE UNIQUE CLUSTERED INDEX [ClusteredIndex-Logs] ON [tlg].[UserVisitLogs]
(
	[TStamp]			DESC,
	[SessionID] 		ASC,
	[CorrelationID] 	ASC,
	[SchoolCode] 		ASC,
	[SiteExternalID] 	ASC,
	[WebType] 			ASC,
	[WebUrl] 			ASC,
	[UserSid] 			ASC,
	[UserName] 			ASC,
	[UserType] 			ASC,
	[CachedUser] 		ASC
) WITH (IGNORE_DUP_KEY = ON)
GO
--ROLLBACK TRAN

/*
Thanks to dbatools.io : https://www.youtube.com/watch?v=kQYUrSlb0wg
*/