Importing a BACPAC to SQL Server

We previously looked at Archiving Azure SQL Database to a BACPAC, and in today’s post we are going to look at accessing that data by restoring, or importing, the BACPAC to a local SQL Server.

As discussed in the previous post, this requirement arose in a conversation with a customer who needed to keep monthly archives of their data for a 10-year period to comply with industry regulations and protect against legal action. The customer is looking to use Azure SQL Database, but the built-in backup functionality does not cover that entire retention period. The concern was around being able to easily restore an archive to a local SQL Server should they decide at some point in the future to move away from the PaaS solution. Let’s be real though, who would want to leave Azure SQL Database? It’s awesome.

The first half of the requirement, archiving the data from Azure SQL Database, is complete.
Up next is the other half: taking the archive and accessing the data using SQL Server.

The Data

In the previous post I created a small sample dataset with just three rows of data. You can see that represented in the screenshot below. I have also downloaded the BACPAC from my Azure storage account in preparation for restoring it locally.


Importing the BACPAC

To start, open SQL Server Management Studio (SSMS) and connect to a local instance of SQL Server. Right-click on the instance name and select Import Data-tier Application.


Click Next to move past the welcome screen of the import wizard.


Click Browse and locate the BACPAC file on your local computer. Click Next.


Alternatively, change the radio button to Import from Windows Azure and click Connect. You will be prompted to enter your storage account name and access key, then locate the BACPAC in your storage account. The file will be downloaded to a temporary directory (which can also be specified in the wizard) as part of the import process.
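For those who prefer scripting over clicking through the wizard, the same import can also be performed from the command line with the SqlPackage utility that ships alongside SSMS and the DacFx tooling. A minimal sketch is below; the file path, server name, and database name are placeholders, so substitute your own values:

```
SqlPackage.exe /Action:Import ^
    /SourceFile:"C:\Temp\MyArchive.bacpac" ^
    /TargetServerName:"localhost" ^
    /TargetDatabaseName:"MyArchiveRestore"
```

This is handy if you need to restore a stack of monthly archives in a loop rather than importing them one at a time through the wizard.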


On the database settings page of the wizard, the database name and the data and log file storage paths can be modified. The default locations for the data and log files will be pulled from the instance defaults. Click Next.
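If you are unsure where those defaults point on your instance, a quick way to check (assuming SQL Server 2012 or later, where these server properties are available) is:

```sql
-- Returns the instance-level default locations for new data and log files
SELECT SERVERPROPERTY('InstanceDefaultDataPath') AS DefaultDataPath,
       SERVERPROPERTY('InstanceDefaultLogPath')  AS DefaultLogPath;
```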


Click Finish on the Summary page to begin the import.


Each step and the status of the operation will be displayed. Assuming all steps show green check marks, click Close on the wizard. If there are any errors, click the link in the Result column to see the details behind the failure. There should now be a new database in the SQL Server Object Explorer carrying the same name specified on the Database Settings page of the import wizard.


Running a quick SELECT statement validates that the data was imported successfully and shows the same three records that were created in the original Azure SQL Database archived in the previous post.
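As a sketch, the validation query looks like the following; the database and table names here are placeholders, so substitute whatever you entered in the wizard and whatever table the previous post's sample used:

```sql
USE MyArchiveRestore;  -- the name given on the Database Settings page
GO
SELECT *               -- should return the three sample rows from the archive
FROM dbo.SampleTable;  -- placeholder for the table created in the previous post
GO
```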


This satisfies the full set of requirements given by the customer:

  • Full backup of the data, archived monthly for 10 years – this can be stored in Azure blob storage and/or downloaded and stored locally
  • Ability to restore the archive at any time – a BACPAC can be imported to Azure SQL Database or to a local SQL Server
  • Maintain data access should the customer decide to no longer leverage Azure SQL Database – BACPAC files can be imported to a local SQL Server instance

Bradley Schacht

Bradley Schacht is a Cloud Solution Architect on the state and local government team with Microsoft, based in Jacksonville, FL. He has co-authored three SQL Server books, including "SQL Server 2014 Professional Administration". As a former consultant and trainer, he uses his experience across the Microsoft BI and data platform to help customers deliver the best possible solutions. Bradley frequently presents at community events around the country. He is a contributor to several community sites and an active member of the Jacksonville SQL Server User Group (JSSUG).


11 Responses

  1. Yohoho says:

    Maybe a stupid question, but what is wrong with a full backup?
    (I don't have any experience with BACPACs or Azure.)

    • We don’t have the ability to take a traditional SQL Server backup in the Azure SQL Database environment (Azure’s managed service). Our only option today is a data export from Azure SQL Database, which creates the BACPAC. Therefore we can’t do a traditional restore of that data into a regular SQL Server, only import the BACPAC.

  2. Meg says:

    Hi Bradley,

    I understand this is an old post, but I have a quick question you may be able to help me with. We have Azure running SQL Server 2016 as a service. I have retrieved my BACPAC successfully, but I need to restore it on a Windows 7 machine. The issue being that SQL Server 2016 will not run on Windows 7. Is there any way around this without having to upgrade the Windows 7 machine?

    Many thanks for any advice offered

    • Windows 7 is only on extended support at this point, so that is likely the reason behind the requirement for Windows 8 or higher. If you want to get around it, the best way is likely to set up a virtual machine with Windows 10, either locally or online someplace like Azure. Don’t forget that there is only a 64-bit version of SQL Server 2016; no more 32-bit support.

  3. Angelo Alessandrini says:

    Hi Bradley, I have followed your instructions to restore a BACPAC file from Azure but I’m encountering some problems.
    The import does not work; I received the following error:
    Could not load schema from package (Microsoft.SqlServer.Dac)
    Additional Information: Internal Error. The database platform service with type Microsoft.Data.Tools.Schema.SqlAzureV12DatabaseSchemaProvider is not valid. You must make sure the service is loaded, or you must provide the full type name of a valid database platform service (Microsoft.Data.Tools.Schema.Sql)

    Any ideas?
    Thank you. Angelo

  4. Angelo Alessandrini says:

    Hi Bradley, thank you for your reply. I will try to test your recommendations and I will give you feedback.
    Kind regards. Angelo

  5. Angelo Alessandrini says:

    Hi Bradley, with Microsoft SQL Server Management Studio and SQL Server 2017 it works… 🙂 Thank you

  6. So we just tried this today on SQL Server 2016 and got errors.
    Could not import package.
    Warning SQL0: The source contains users that rely on an external authentication provider that is not supported by the target. These users will be treated as users without logins.
    Warning SQL72012: The object [db_Data] exists in the target, but it will not be dropped even though you selected the ‘Generate drop statements for objects that are in the target database but that are not in the source’ check box.
    [repeated for _Log]
    Error SQL72014: .Net SqlClient Data Provider:….

    • From where was this BACPAC originally exported? And what version of SSMS are you using? I’ve seen a few cases where people were running older versions of SSMS, and simply updating to the current release and re-running the import solved the issue.

      • Hi Bradley, thanks for replying.
        After running the specified CONFIGURE command, it got further but died when it encountered missing functions not in SQL Server 2016. Ended up importing into a new SQL Azure instance which worked.
